
AI improves; Shakespeare approves, yet the internet is going seriously toxic

AI gives anyone access to extensive data, with analytic reviews, within seconds. AI can identify anomalies, similarities, potential customers and criminals. The amount of information AI can present on any one topic is extraordinary: a simple request can generate pages of data. The need is to ask the right question and then act as editor of the response, to ensure the question has been addressed correctly.

Reading an AI report on payments, the word "frictionful" was used to describe the KYC (know your customer) process. This is not a recognized word in most dictionaries. The AI arrived at "frictionful" by looking at the current state of KYC, which requests original documents for ID verification. This creates friction in the form of time lost waiting for the documents to arrive and confirming them as bona fide. "Friction" is frequently used in payments to mean a cause of delay, which the AI noted.

Shakespeare

Shakespearean plays of the 17th century used 3,000 words not previously found in the English language (half are still used). The period of Queen Elizabeth is known as the Golden Age. The period today - the real-time, technological, digital age - could well become golden too. Hopefully, AI users "won't be hoist by their own petard".

Vector Search

To create these algorithms, many AI companies, including many cloud technology companies, use vector search. The approximate nearest neighbour (ANN) technique uses distance metrics to measure vector similarity. It is efficient and fast, at the cost of being approximate when identifying similar neighbours.

Hence ANN algorithms efficiently search for the vectors that are approximately closest to a given query. By allowing a level of approximation, these algorithms can significantly reduce the computational cost of finding nearest neighbours. Because the search is approximate and does not compute every embedding similarity exactly, inaccuracies can occur.
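As a simplified illustration (not from the article), the sketch below runs the exact, brute-force version of this search using cosine similarity over NumPy arrays; ANN libraries such as FAISS or Annoy approximate the same computation to avoid scoring every stored vector. The array sizes and vectors are illustrative assumptions.

```python
import numpy as np

def cosine_top_k(query: np.ndarray, corpus: np.ndarray, k: int = 5) -> np.ndarray:
    """Return indices of the k corpus vectors most similar to the query."""
    # Normalise so that a dot product equals cosine similarity.
    q = query / np.linalg.norm(query)
    c = corpus / np.linalg.norm(corpus, axis=1, keepdims=True)
    scores = c @ q                       # one similarity score per corpus vector
    return np.argsort(scores)[::-1][:k]  # indices of the highest scores first

# Illustrative data: 10,000 random 384-dimensional "embeddings" and one query.
rng = np.random.default_rng(0)
corpus = rng.normal(size=(10_000, 384))
query = rng.normal(size=384)
print(cosine_top_k(query, corpus, k=3))
```

An ANN index trades a little of this exactness for speed, which is where the inaccuracies mentioned above come from.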

A real dilemma arises when opposites attract, for example "supercops" and criminals. An L.A. study found they resemble each other in their dominant dispositions but differ from average citizens, and it indicated a need to re-evaluate police officer selection criteria. These subtle differences will, in time, be identified and corrected by AI.

Growing Internet Toxicity 

The Internet holds so much information that at first search engines were needed; now AI is there to guide and assist. The internet carries traffic from both Humans and Bots, with Bots categorised as either Good or Bad. The usage trends are:

  • In 2020, 60% of usage was by Humans. Last year it dropped to under 50%, after an annual decline of 8% over three years.
  • By 2024, Bot traffic topped 51%. Good Bot usage dropped 18% over the last three years, while Bad Bot usage rose 23%.
  • At the current trend, Bad Bots will constitute 50% of internet traffic by 2027.

Internet traffic trends are concerning

Human traffic has dropped from 60% to 50%, Good Bots are losing ground to Bad Bots, and by 2027 half of all internet traffic could be toxic.

Many companies use APIs to connect to, well, everything. APIs are often old and uniquely built. Bad Bot attacks focused on APIs accounted for 44% of total malicious activity. The most impersonated browser is Chrome, at 46% of attacks.
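One common, if basic, defence against automated abuse of APIs is per-client rate limiting. The token-bucket sketch below is illustrative only; the capacity, refill rate and client identifier are assumptions, not figures or recommendations from the article.

```python
import time
from collections import defaultdict

class TokenBucket:
    """Allow roughly refill_per_sec requests per second per client, with bursts up to capacity."""

    def __init__(self, capacity: float = 20, refill_per_sec: float = 1.0):
        self.capacity = capacity
        self.refill_per_sec = refill_per_sec
        self.tokens = defaultdict(lambda: capacity)   # every new client starts with a full bucket
        self.last_seen = defaultdict(time.monotonic)  # first access records "now"

    def allow(self, client_id: str) -> bool:
        now = time.monotonic()
        elapsed = now - self.last_seen[client_id]
        self.last_seen[client_id] = now
        # Refill for the time elapsed since the last request, capped at capacity.
        self.tokens[client_id] = min(self.capacity,
                                     self.tokens[client_id] + elapsed * self.refill_per_sec)
        if self.tokens[client_id] >= 1:
            self.tokens[client_id] -= 1
            return True
        return False  # over the limit: a candidate for blocking or extra checks

limiter = TokenBucket()
print(limiter.allow("203.0.113.7"))  # True at a normal request rate
```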

Challenge

The challenge is to encourage AI along the right path while navigating an increasingly toxic digital environment. This is particularly key in preventing Bad Bots from attacking high-profile, limited-edition and expensive events that come to market to generate sales.

Activities to address the situation

  1. Digital Verification: the City of London (UK) initiative is welcome.
  2. APIs and computers need to be on the latest releases.
    1. For example, mobile phones are updated automatically.
  3. Multi-layered security, for example two-step authentication (recommended by banks and the police), should become standard for data and transactions; a minimal sketch of one such second factor follows this list.
    1. The CAPTCHA human identifier has already been cracked by Bad Bots using AI.
  4. Payment steps from initiation to completion need to be protected from Bad Bots.
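To illustrate point 3, here is a minimal sketch of a time-based one-time password (TOTP, RFC 6238), a common second authentication factor, using only Python's standard library. The base32 secret is a well-known demo value, not a real credential.

```python
import base64, hashlib, hmac, struct, time

def totp(secret_b32: str, digits: int = 6, period: int = 30) -> str:
    """Generate the current time-based one-time password for a base32 secret."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // period           # 30-second time step
    msg = struct.pack(">Q", counter)               # 8-byte big-endian counter
    digest = hmac.new(key, msg, hashlib.sha1).digest()
    offset = digest[-1] & 0x0F                     # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % (10 ** digits)).zfill(digits)

# Demo secret widely used in TOTP examples; a real deployment would provision
# a per-user secret and verify the submitted code server-side.
print(totp("JBSWY3DPEHPK3PXP"))
```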

Criminals use AI to generate advanced Bad Bots in pursuit of their mission:

“you can never have too much technology robbery”.

 

          

