Here are some of the red flags to watch for when evaluating AI NFTs.
NFTs are on a collision course with artificial intelligence, which in theory could unlock all kinds of new utility, including AI-driven avatars that can carry on conversations.
However, because few people understand how this kind of AI actually works, it is easy to deceive buyers. Recently, several so-called "intelligent" NFT projects have started to cash in on the digital gold rush, promoting themselves as innovative "AI," winning the backing of influential investors, and breaking resale records. The market will soon be flooded with "AI NFTs," most of which are, to borrow the community's own slang, positively NGMI (not gonna make it). Here are five red flags to look out for:
- It's a cheap deepfake
- Using GPT-3 or having other third-party dependencies
- AI is "on the roadmap" (and everywhere in marketing collateral)
- Team has no AI experience
- A rose by any other name smells... like tulips
It's a cheap deepfake
If you've ever seen a deepfake-animated portrait, the first time feels like some serious Harry Potter magic come to life. But this "magic" is readily accessible to casual muggles via a number of free apps (like TokkingHeads, MyHeritage, and Wombo) that let you animate any image using a driving video. So if you're building, or already own, an NFT and want to get it moving, there's an inexpensive way to do it yourself. The catch: anything beyond minimal movement, such as blinking or breathing, can destroy the illusion because of inevitable inconsistencies between the still image and the driving video. So if you see an animated NFT that doesn't move much, it was almost certainly made with off-the-shelf software, not some proprietary AI secret sauce.
These cheap deepfakes are being used to advertise things like portraits of historical figures that, once hooked up to GPT-3, will supposedly talk to you in real time. The problem is that real-time conversation with an AI requires a 3D CG model. So if you come across an advertised deepfake that claims to talk in real time, alarm bells should ring, because deepfakes don't actually work on their own. Instead, they rely on superimposing one person's face onto an actor's, either in post-production (like Mark Zuckerberg's face in Trey Parker's deepfake satire, or Tom Cruise's on a talented TikTok impersonator) or live during a video call (say, a deepfake Elon zoom-bombing via free software like Avatarify).
Using GPT-3 or having other third-party dependencies
The Generative Pre-trained Transformer 3 (GPT-3) language model excels at generating text in the style of a specific person or genre, given enough examples. Massive generative language models, which are not only slow to respond but so expensive to train that GPT-3, trained before COVID, is unaware the virus exists, learn from the public internet, which is a bit like drinking from a public toilet. People are often potty-mouthed monsters toward software (and each other) on the internet, and even people involved in GPT-3-based role-playing games have been alarmed when the algorithms steered children into sexual scenarios. That is why OpenAI is keen to limit chatbot use cases, and why any NFT project claiming to use GPT-3, especially to power "personality" add-ons, should raise an eyebrow.
After years of limited betas with long waiting lists, OpenAI has just made its GPT-3 API generally available to developers (until and unless Microsoft, which funds OpenAI and licenses GPT-3, shuts the door). Until now, a few early-access innovators used GPT-3 to create and sell unique "personalities" for NFTs. Now that the API is truly open, practically anyone can generate a GPT-3-backed "personality" by supplying a few text samples. Whether OpenAI will allow your particular use case to stay live is another story. Last summer, the San Francisco Chronicle published a heartbreaking story about a man who used a GPT-3-powered service to bring his dead fiancée back to life through text. Months later, OpenAI shut the service down without warning for violating its strict terms; Joshua's fiancée Jessica essentially died twice. A GPT-3-powered NFT would likely suffer the same fate. NFTs are ultimately about verified ownership. If your NFT's AI chat comes from a third-party service that the seller doesn't own, develop, or control, what exactly do you own?
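To see why the "personality" is never really inside the NFT, it helps to look at how these GPT-3 personalities are typically wired up: it is just few-shot prompting. Below is a minimal sketch; the persona name, sample lines, and helper function are illustrative assumptions, not taken from any real project, and the commented-out call uses OpenAI's classic Completions interface, which requires an OpenAI account and API key.

```python
# Hypothetical sketch of a "GPT-3 personality": a few sample lines
# establish the character's voice, then the user's message is appended
# and the model is asked to continue as that character.

def build_persona_prompt(samples, user_message, character="Ada"):
    """Assemble a few-shot prompt from persona sample lines and a user message."""
    lines = [f"{character}: {s}" for s in samples]
    lines.append(f"User: {user_message}")
    lines.append(f"{character}:")  # model completes from here, "in character"
    return "\n".join(lines)

if __name__ == "__main__":
    prompt = build_persona_prompt(
        ["I adore mathematics.", "Machines can be quite poetic."],
        "What do you think of NFTs?",
    )
    print(prompt)
    # Actually generating a reply means calling OpenAI's hosted API, e.g.:
    #   import os, openai
    #   openai.api_key = os.environ["OPENAI_API_KEY"]
    #   resp = openai.Completion.create(engine="davinci", prompt=prompt,
    #                                   max_tokens=60, stop=["User:"])
    # The "personality" therefore lives on OpenAI's servers and under
    # OpenAI's terms of use -- not in the token the buyer owns.
```

The point of the sketch is the last comment: every reply requires a live call to a third-party service that can revoke access at any time, which is exactly the ownership problem described above.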
AI is "on the roadmap" (and everywhere in marketing collateral)
While use of GPT-3 or any third-party dependency is serious cause for pause, "AI" that appears in marketing materials or on a project roadmap with no evidence of it in the actual product is a red flag. AI is not an ingredient that anyone can simply sprinkle in, given enough money and time. Yet amid the recent hype around a few AI NFT projects, teams are now slapping "AI" onto their projects almost as fast as they are adding "metaverse" to everything.
Team has no AI experience
When in doubt, the project's team page usually reveals whether an ambitious AI roadmap has a prayer. Zero AI experience = big red flag.
Even teams with AI expertise often struggle to deliver. Take Little Sophia, a miniature version of the famous humanoid robot Sophia, which has been crowdfunding since 2019 with a steadily slipping ship date (December 2022 at the time of writing) and increasingly frustrated backers. Sophia, who made headlines in May with the sale of an NFT falsely billed as the first digital artwork by a non-human, and whose makers have claimed she is "basically alive," is a big part of the AI hype problem. (The NFT, which sold at auction for $688,888, was actually created in partnership with real human artists, and was preceded by decades of AI digital artists such as Harold Cohen's AARON.)
Unfortunately, these marketing tactics, including getting Sophia citizenship in Saudi Arabia, seem to work on the public. In 2017, an offshoot of the team behind Sophia raised $36 million in under a minute in an initial coin offering to build decentralized artificial general intelligence, or AGI, on the blockchain. Last I checked, they, along with the tech giants and the best researchers in AI, have yet to crack AGI. Moral of the story: even a team with AI experience can fail to deliver, and if their primary experience is coin offerings, marketing, or other get-rich-quick schemes, it may be time to head for the hills.
A rose by any other name smells... like tulips
Speaking of marketing, the man who invented the Pet Rock was a genius. (Also a millionaire, having sold over a million Pet Rocks at four dollars apiece.) The rock was just a rock, with a cute, tongue-in-cheek care and feeding manual (spoiler: none required). But calling them "pets" was branding in the same vein as Beanie Babies or, dare I say, Bored Apes. Case in point: the "first iNFT," which Sotheby's auctioned off for nearly half a million dollars, was really just a chatbot attached to an avatar: a technology that, while not older than rocks, dates back to the 1960s. Which is fine, as long as the buyer knows that what they are buying is essentially a rebranded chatbot.
Despite the packaging, chatbot-powered avatars (er, "AI NFTs") are a promising "new" category, much as the metaverse is a promising "new" Second Life. NFTs promise to revolutionize digital ownership and put money where it belongs: back in the hands of creators. But even if the industry overcomes its other hurdles, such as a high carbon footprint, we will only reach mass adoption if we call out the deception and hype together so that quality work can shine.