GPT-3 Thread

When can we expect GPT-3 to be included in the AI Marketplace? I know it technically is still owned by OpenAI (which definitely isn’t that open), but there has to be a point at which SingularityNET uses GPT-3 or something very similar, right?


I would like to know this as well. In my opinion, GPT-3 and similar systems solve some linguistics problems, helping AI to understand at least a small part of the way we humans string together language as a whole. I think generative transformer type programming is at least a small step in the right direction toward helping AI understand many, if not all, of our complex languages.

Also on this note, I find it fascinating that, at least on the surface, it seems as though GPT-3 has the ability to “learn” about things it wasn’t originally programmed for. If we supersized a generative transformer model, say with a quantum computer and DNA-based storage (or something similar with very high storage capacity, even if its read/write speed were limited for some reason), I think we would end up with an AI that could learn at least basic things such as addition, as seems to be the case with GPT-3, using language as the basis. With much faster storage and compute, I think we’ll crack at least some small part of the much larger whole of a “human-based” AI or AGI. I should say I have no deep understanding of AI and am not a programmer; these are just my thoughts and opinions on the matter.
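To make the “learning addition from language” idea concrete: the few-shot setup GPT-3 is known for doesn’t program the model to add. It just shows a few worked examples as plain text and asks the model to continue the pattern. Here’s a minimal sketch of how such a prompt is built; the function name is made up for illustration, and a real experiment would send the resulting prompt to an actual model.

```python
# Illustrative sketch only: building a few-shot addition prompt of the
# kind GPT-3 appears to learn from. No model is called here; the prompt
# would be sent to a real language model to complete the final answer.

def build_addition_prompt(examples, query):
    """Turn worked (a, b) pairs into a plain-text few-shot prompt."""
    lines = [f"Q: What is {a} + {b}? A: {a + b}" for a, b in examples]
    # The final line is left unanswered for the model to complete.
    lines.append(f"Q: What is {query[0]} + {query[1]}? A:")
    return "\n".join(lines)

prompt = build_addition_prompt([(2, 3), (10, 4), (7, 8)], (12, 9))
print(prompt)
```

The point of the sketch is that everything the model sees is ordinary text; any “arithmetic ability” it shows comes from pattern completion over language, which is exactly what makes the behavior feel like learning something it wasn’t programmed for.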

Also, on this note, I find it very strange that the more I look for AI similar to GPT-3 and generative transformer (GTAI) type programming, the less I seem to find. I believe Sophia uses, or used to use, GPT-2 as a base; at the time of writing, I’ve found a video on YouTube, “A.I Sophia Meet AI Robot Philip,” posted Mar 28, 2020, stating as much. It makes me wonder what generative transformer type programming is truly capable of in the future with more data and more speed, and why it has been so tight-lipped recently.

One example is Google and their GShard type tech, basically a souped-up GPT-3 using 600 billion parameters, compared to the 1 trillion they originally wanted. Makes me wonder what happened. Did the AI need rest, as some opinions I’ve heard suggest? I can’t confirm or deny that; it’s just something I happened to hear. Trying to find more on GTAI, I’ve only found a single PDF and one video on the Google tech to date, which makes me wonder if they slipped through the cracks somehow and will soon be deleted from the whole internet (not including the deeper parts of the web). At least, I hope they wouldn’t be able to do that. What would it say about the future of knowledge as we know it if all traces of something could be entirely deleted from existence without us ever knowing about it? That one corporation, no matter how “good” they seem to be or are, doesn’t matter: I think that very soon, AI research and control over it, regardless of how good the intentions are, is too much power, or will be very soon.

Also, to answer the original question posed: I think this comes pretty close; I just now found it. I’m new here.