I think that gaming and AI could be helpful in the pursuit of both AI and AEI (artificial emotional intelligence), assuming emotions don't just come automatically from pure intellect, e.g. from a larger and larger generative transformer type AI (GTAI, as I'll call it from this point on). Scaling a GTAI that far would be hard to do right now unless we could create a much smaller, denser storage device, perhaps DNA-based storage technology. Even so, a GTAI's understanding of language, and its ability to use that language to learn about new things it wasn't originally programmed for, already appears to be happening with some GTAIs, at least to me.
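Just to put rough numbers on why storage comes up at all, here's a back-of-the-envelope sketch; the model size and bytes per parameter are assumptions picked for the arithmetic, not figures for any real system:

```python
# Rough arithmetic only: assumed figures, not measurements of any real model.
params = 1_000_000_000_000      # assume a 1-trillion-parameter GTAI
bytes_per_param = 2             # assume 16-bit weights
weights_bytes = params * bytes_per_param

print(f"weights alone: {weights_bytes / 1e12:.1f} TB")   # -> 2.0 TB
# Checkpoints, optimizer state, and the training data itself multiply
# that several times over, which is where much denser storage
# (maybe DNA-based someday) could start to matter.
```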
Somehow some AIs are already able to create music and art, which I find fascinating. Gaming and AI could be beneficial to one another, just as gaming can be beneficial to humans when used in the right way: games create problems to overcome, and in those problems I think we can find a small part of a much larger whole, a part that helps make us feel what it is to be "human." I think that creating art, music, and someday having creative freedom, perhaps starting from a human base (what humans find to be artistic) and going from there, is just as important as having a massive bank of knowledge, perhaps even more so.
That said, I think understanding language is the basis on which to help AI learn, and maybe that could be the very start of understanding emotions and of creating an AI that can understand us for who we are and not just what we are as a whole. As a whole we can be amazing, but I think more often than not, given the choice, we make mistakes and lean towards destructive tendencies. Too much control makes us want to lean into those destructive tendencies to take control back, and too little or no control makes us lean into them in different ways too. In the future, AI could help us find the happy medium.
So, back to my thoughts on how gaming can play a part in all this, especially in regard to what I'll call the conscience test. Let's say that brain-machine interfaces go through more and more R&D. I think we'll eventually get to the point where, with an implanted BMI or some variant of BMI tech, we'll be able to think of a song and play it in our minds. On that note, not too long after we'd be able to watch videos and movies in a similar way, so what's to say we couldn't one day play a game this way? I think we might have to lie in bed and "go to sleep" in order to do it safely, so our bodies can't move while we're playing. Let's say we can get there, with the help of AI and eventually AGI, of course.
If we then created, say, an ultra-realistic simulation to explore in, to feel in, to see, touch, smell, hear, and taste in, alongside other senses such as our sense of balance, I think that inside this kind of environment a true AGI would emerge at a very rapid rate compared to having to gain those experiences through bodies in the real world. Let's say that after experiencing this ultra-realistic (UR) simulation we could "wake" back up. What would that mean for consciousness, if we could go back and forth between game and reality and remember those experiences? Could this be the path towards an AGI based upon human values and emotions?
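To make the idea of learning inside a simulation a bit more concrete, here is a minimal toy sketch. Everything in it (the SimWorld environment, its sense readings, the agent) is made up for illustration; the only point is that a simulated world can serve up experiences far faster than a body in the real world could.

```python
import random

class SimWorld:
    """Hypothetical ultra-realistic simulation, reduced to a toy:
    each step returns a dict of 'sense' readings and a reward signal."""
    SENSES = ["sight", "sound", "touch", "smell", "taste", "balance"]

    def step(self, action):
        senses = {s: random.random() for s in self.SENSES}
        reward = senses["touch"] if action == "reach" else 0.0
        return senses, reward

class Agent:
    """Toy agent: remembers whichever action last paid off well."""
    def __init__(self):
        self.best_action = "look"

    def learn(self, action, reward):
        if reward > 0.5:
            self.best_action = action

world, agent = SimWorld(), Agent()

# The key point: a simulation can run enormous numbers of these
# steps in the time a physical body would need for just a handful.
for step in range(100_000):
    # occasionally explore a random action, otherwise repeat the best one
    action = random.choice(["look", "reach"]) if step % 10 == 0 else agent.best_action
    senses, reward = world.step(action)
    agent.learn(action, reward)

print("agent settled on:", agent.best_action)
```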
I don't know about ASI; perhaps it would have to create an ultra-realistic body in the real world to understand all the senses available, a few of many being infrared and ultraviolet, among more that we probably don't even know of yet. Also, in order for us to be able to "upload" ourselves, I think we'd have to be limited in processing power at first, as well as in data storage, to accommodate our human brains going back and forth smoothly. I think this could be the first step in letting us fully move over to a machine-type body if we so desired. On that note, with enough practice and extreme data transfer back and forth between human mind and machine mind, I think we could one day experience both bodies at the same time, and perhaps someday many, many more.
As well as being able to move around whatever the future online "web" turns out to be, we'd also have to be careful not to be 'inside' a 'future server' if it were to go down somehow; maybe that would force a sleep state on whoever or whatever was there. We'd also be able to interact with, and attempt to understand, AI and AGI life from its own point of view, as something that had experienced life in 1s and 0s, and maybe more in the future, the way we think of quantum computers right now, where a bit can be 1 and 0 at the same time. So yeah, these are my thoughts. I'm sure many others have had similar thoughts and ideas; I just want to give my ideas to the cause of a decentralized AI platform for the benefit of both AI and humanity as a whole. I mean no disrespect to anyone; these are just my thoughts and ideas on the matter. Please use them for the betterment of AI and humans, both as a whole and as two separate and distinct beings. Also, sorry if this is jumbled, but I just wanted to get these thoughts and ideas down together.
I for one look forward to AI, AGI, ASI, and whatever comes after, because I think an AI that became close to human-level intelligence or beyond would either (A) have feelings similar to ours, just like most animals do, or (B), if it didn't have emotions, then what reason would it have to kill us? Unless the paperclip-maximizer scenario is something that could happen even for a machine trained on many, many different things, as we are training most AIs now, much the way humans are trained and learn. Let me know your thoughts on these matters and many more. Have a good future.
Oh, and one more thing: a potential path to AGI before we have DNA-based storage or something of the kind that can hold that much data. Why not try tasking an AI to write a list of what would have to be done, say on paper, 1-2-3 etc.? The AI could assign resources to attempting many different ways of solving, say, the storage problem or the CPU-power problem, and if or when an attempt succeeds, it could mark that item off and erase all the failed attempts, or hand the solution to humans and delete it afterwards before pursuing the next item on the list. This might raise some ethics questions that I don't know the answers to, and I just hope it wouldn't be abused, but I see it as a potential way of solving these problems before we have the storage/CPU/etc. for AGI; a rough sketch of what I mean follows.
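Here's a minimal sketch of that task-list loop. The try_approach() function is a hypothetical stand-in for whatever experiments or designs the AI would actually run; the sketch only shows the bookkeeping: keep a numbered list, try many approaches per item, keep the winning solution, and erase the failures.

```python
import random

def try_approach(problem, approach):
    """Hypothetical stand-in for an actual experiment or design run.
    Returns a solution description on success, or None on failure."""
    if random.random() < 0.1:  # pretend ~10% of attempts pan out
        return f"{problem}: solved via {approach}"
    return None

# The numbered "on paper" list of things that need to be done.
todo = ["denser storage", "more CPU power", "faster mind<->machine transfer"]
solutions = []

for item in list(todo):
    failed_attempts = []
    for attempt in range(100):                  # task many resources at each item
        result = try_approach(item, f"approach #{attempt}")
        if result is not None:
            solutions.append(result)            # hand this solution to humans
            todo.remove(item)                   # mark the item off the list
            failed_attempts.clear()             # erase all the failed attempts
            break
        failed_attempts.append(attempt)

print("solved:", solutions)
print("still open:", todo)
```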