The first credit cards in the United States were, by modern standards, quite safe. Arguably, they date all the way back to the late 1930s, when a group of gas station chains set out to modernize the store-specific charge accounts customers had long kept at individual storefronts. The stations agreed to honor one another's customer accounts, and each customer carried a unique card with an identifying number so that charges could be tracked across locations. The basic idea of the credit card was born.
These early cards were limited in scope, and fraud, when it occurred, looked much like fraud anywhere else: a family member or friend could lie to the clerk and claim permission to use the card when they had none. This basic form of identity theft had limited potential for damage, and whether a "charge" should be authorized was left to each individual clerk's discretion.
Credit and charge cards slowly gained popularity across the United States for the next several decades. Then, the 1990s hit with a vengeance.
Visa, MasterCard, and the DotCom Boom
In roughly a decade, credit was transformed from simple, often store-specific transactions into a worldwide phenomenon. The arrival of the World Wide Web in the early 1990s opened new possibilities for payment processing both online and off. Merchants embraced card swiping at the register, online payment giants such as PayPal launched, and the entire game changed: payments became faster and more convenient.
Additionally, widespread acceptance of major credit cards made e-commerce viable. Instead of selling only to a small, computer-literate audience, merchants could have almost anyone recite a card number over the phone or type it into a checkout form. Further advances such as one-click payment made e-commerce easier still for merchants and consumers alike, and consumers could increasingly skip cash and checks in favor of cards for everyday transactions. Few people considered how any of this could go wrong beyond theft comparable to cash theft.
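One safeguard that helped make phoned-in and typed-in card numbers workable is the Luhn checksum, which the major card networks build into every account number: a single mistyped digit usually fails the check before a payment is even attempted. Here is a minimal sketch in Python (the number below is a well-formed test value, not a real account):

```python
def luhn_valid(number: str) -> bool:
    """Return True if the digit string passes the Luhn checksum."""
    digits = [int(ch) for ch in number if ch.isdigit()]
    total = 0
    # Walk the digits from right to left, doubling every second one.
    for i, d in enumerate(reversed(digits)):
        if i % 2 == 1:
            d *= 2
            if d > 9:
                d -= 9  # same as summing the two digits of the product
        total += d
    return total % 10 == 0

print(luhn_valid("4539148803436467"))  # well-formed test number -> True
print(luhn_valid("4539148803436468"))  # one digit off -> False
```

Note that the checksum only catches typos; it offers no protection against someone who has copied a valid number, which is exactly the gap fraudsters went on to exploit.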
How Credit Processing Did Go Wrong and Continues to Go Wrong
Then even swiping a credit card came to be seen as too slow. Major banks and card networks began embedding small chips in cards: tapping, after all, should be even more convenient for customers waiting in long lines and for merchants eager to move those lines along. However, criminals figured out that they could harvest data from contactless chips wirelessly, without ever touching a card or leaving a single fingerprint. It was a brilliant and highly dangerous realization.
Literally Robbed Blind While Standing in Line
Today, consumers can literally be robbed of sensitive card data with no idea it is happening, simply by standing in a line or a crowded place. A few pointed questions suggest the industry has taken a huge step backward, sacrificing consumer protection for marginal convenience:
• Does it really take too long to swipe a card?
• What are the benefits of payment by tapping a card with an embedded chip?
• Why do most major banks refuse to send customers cards without chips?
The Bottom Line
Credit card processing has opened many new opportunities in a short period of time. However, one question remains:
Has the industry gone too far?