Curated by THEOUTPOST
On Thu, 7 Nov, 12:03 AM UTC
3 Sources
[1]
AI injected into Minecraft and left alone is now teaching itself how to play
Researchers have injected an artificial intelligence program into an instance of Minecraft and left it to its own devices; the AI is now teaching itself how to survive. The AI was developed by the company SingularityNET and the Artificial Superintelligence Alliance (ASI Alliance) and has been named Autonomous Intelligent Reinforcement Inferred Symbolism, or AIRIS for short. According to reports, the AI essentially started from nothing within Minecraft and over time slowly taught itself how to play using nothing but the game's feedback loop.

The researchers explained how the AI is able to play Minecraft. The agent is given two types of environmental inputs and a list of actions it can perform. The first input is a 5 x 5 x 5 3D grid of the block names that surround the agent; the researchers say this is how the AI is able to "see" the world around it. The second input is the agent's current coordinates. The available actions are movement-based: the agent can move or jump in one of eight directions (the four cardinal directions plus the four diagonals), for a total of 16 actions.

"The agent begins in 'Free Roam' mode and seeks to explore the world around it. Building an internal map of where it has been that can be viewed with the included visualization tool. It learns how to navigate the world and as it encounters obstacles like trees, mountains, caves, etc. it learns and adapts to them. For example, if it falls into a deep cave, it will explore its way out. Its goal is to fill in any empty space in its internal map. So it seeks out ways to get to places it hasn't yet seen."

The researchers say future iterations of the AI will have more actions, such as placing blocks, collecting resources, fighting monsters, and crafting.

"If we give the agent a set of coordinates, it will stop freely exploring and navigate its way to wherever we want it to go. Exploring its way through areas that it has never seen.
That could be on top of a mountain, deep in a cave, or in the middle of an ocean. Once it reaches its destination, we can give it another set of coordinates or return it to free roam to explore from there."
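The observation and action space the researchers describe is small enough to write down directly. The sketch below is illustrative only: the names and field layout are hypothetical, since the actual AIRIS interface is not public, but it matches the article's description of a 5 x 5 x 5 block grid, the agent's coordinates, and 16 move-or-jump actions across eight directions.

```python
from dataclasses import dataclass
from itertools import product

# The eight horizontal directions: four cardinal plus four diagonal.
DIRECTIONS = [(dx, dz) for dx, dz in product((-1, 0, 1), repeat=2)
              if (dx, dz) != (0, 0)]

# Each direction can be a plain move or a jumping move: 8 x 2 = 16 actions.
ACTIONS = [(direction, jump) for direction in DIRECTIONS
           for jump in (False, True)]

@dataclass
class Observation:
    """What the agent receives from the environment each step (hypothetical)."""
    blocks: list      # 5 x 5 x 5 nested list of block names centred on the agent
    position: tuple   # (x, y, z) world coordinates of the agent
```

An agent loop would then consume an `Observation` and emit one of the 16 entries of `ACTIONS` each tick.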
[2]
AIRIS is a learning AI teaching itself how to play Minecraft
A new learning AI has been left to its own devices within an instance of Minecraft, where it is learning how to play the game through doing, say AI development company SingularityNET and the Artificial Superintelligence Alliance (ASI Alliance). The AI, named AIRIS (Autonomous Intelligent Reinforcement Inferred Symbolism), is essentially starting from nothing inside Minecraft, learning how to play using nothing but the game's feedback loop.

AI has been set loose to learn a game before, but often in more linear 2D spaces. With Minecraft, AIRIS can enter a more complex 3D world and slowly start navigating and exploring to see what it can do and, more importantly, whether the AI can understand game design goals without necessarily being told them. How does it react to changes in the environment? Can it figure out different paths to the same place? Can it play the game with anything resembling the creativity that human players employ in Minecraft?

VentureBeat reached out to SingularityNET and the ASI Alliance to ask why they chose Minecraft specifically. "Early versions of AIRIS were tested in simple 2D grid world puzzle game environments," a representative from the company replied. "We needed to test the system in a 3D environment that was more complex and open ended. Minecraft fits that description nicely, is a very popular game, and has all of the technical requirements needed to plug an AI into it. Minecraft is also already used as a Reinforcement Learning benchmark. That will allow us to directly compare the results of AIRIS to existing algorithms."

They also provided a more in-depth explanation of how it works. "The agent is given two types of input from the environment and a list of actions that it can perform. The first type of input is a 5 x 5 x 5 3D grid of the block names that surround the agent.
That's how the agent "sees" the world. The second type of input is the current coordinates of the agent in the world. That gives us the option to give the agent a location that we want it to reach. The list of actions in this first version is to move or jump in one of eight directions (the four cardinal directions and the four diagonals) for a total of 16 actions. Future versions will have many more actions as we expand the agent's capabilities to include mining, placing blocks, collecting resources, fighting mobs, and crafting.

"The agent begins in 'Free Roam' mode and seeks to explore the world around it. Building an internal map of where it has been that can be viewed with the included visualization tool. It learns how to navigate the world and as it encounters obstacles like trees, mountains, caves, etc. it learns and adapts to them. For example, if it falls into a deep cave, it will explore its way out. Its goal is to fill in any empty space in its internal map. So it seeks out ways to get to places it hasn't yet seen.

"If we give the agent a set of coordinates, it will stop freely exploring and navigate its way to wherever we want it to go. Exploring its way through areas that it has never seen. That could be on top of a mountain, deep in a cave, or in the middle of an ocean. Once it reaches its destination, we can give it another set of coordinates or return it to free roam to explore from there.

"The free exploration and ability to navigate through unknown areas is what sets AIRIS apart from traditional Reinforcement Learning. These are tasks that RL is not capable of doing regardless of how many millions of training episodes or how much compute you give it."

For game development, a successful use case for AIRIS may be automated bug and stress testing. A hypothetical AIRIS that can run across the entirety of Fallout 4 could file bug reports when interacting with NPCs or enemies, for example.
While quality assurance testers would still need to check what the AI has documented, it would speed up a laborious and otherwise frustrating part of development. Moreover, it is a first step toward self-directed AI learning in complex, open-ended 3D worlds. That should be exciting for AI enthusiasts as a whole.
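The "fill in any empty space in its internal map" behavior quoted above resembles classic frontier-based exploration: track which cells have been observed, and repeatedly head for a reachable cell that borders unobserved space. A minimal 2D sketch of that idea (the real agent works in 3D, and its actual algorithm is not public):

```python
from collections import deque

def nearest_frontier(known, passable, start):
    """BFS over known passable cells; return the first cell that borders
    an unobserved cell (a 'frontier'), or None once the map is filled in."""
    seen, queue = {start}, deque([start])
    while queue:
        cell = queue.popleft()
        x, y = cell
        neighbours = [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]
        # A frontier cell touches space the agent has never observed.
        if any(n not in known for n in neighbours):
            return cell
        for n in neighbours:
            if n in known and passable.get(n, False) and n not in seen:
                seen.add(n)
                queue.append(n)
    return None

# Toy map: a 3x3 explored room; everything outside it is unknown.
known = {(x, y) for x in range(3) for y in range(3)}
passable = {c: True for c in known}
target = nearest_frontier(known, passable, (1, 1))
```

An exploring agent would walk to `target`, observe new cells, add them to `known`, and repeat until `nearest_frontier` returns `None`.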
[3]
SingularityNET, ASI Alliance launch self-learning proto-AGI in Minecraft
SingularityNET, a decentralized artificial intelligence network, and the ASI Alliance, a coalition advancing artificial superintelligence, have launched the first self-learning proto-AGI within Minecraft. Unlike typical game AI, the new proto-AGI can adapt, navigate, and create rules based on real-time experiences, marking a leap for artificial general intelligence (AGI) development.

AIRIS, short for Autonomous Intelligent Reinforcement Inferred Symbolism, functions independently of pre-set rules, allowing it to evolve as it encounters new challenges and stimuli. Deploying AIRIS, the first proto-AGI that can learn and adapt autonomously, in Minecraft could open the door to applications in robotics, automation, and smart systems that solve problems in real time.

AIRIS's integration into Minecraft represents a new approach to AI and gaming, as it independently refines its own "rule set" based on in-game experiences. A press release shared with Cointelegraph explained that the proto-AGI adapts its pathfinding, navigation, and obstacle-avoidance strategies in real time, generating new rules when faced with unfamiliar situations. This practicality aligns well with Minecraft's open-ended, unpredictable sandbox world, providing an expansive environment in which the proto-AGI can test the limits of autonomous AI learning.

Future applications

SingularityNET and the ASI Alliance view the integration of AIRIS into Minecraft as a controlled test for future applications that involve adaptive, real-time learning. By refining AIRIS in the complicated digital environment of Minecraft, the team aims to take on real-world challenges that necessitate independent and contextual problem-solving. An ASI Alliance spokesperson told Cointelegraph that the team needed to "evaluate the system in a more complex and open-ended 3D setting."
"Minecraft is an ideal fit for this purpose -- it's widely popular, meets all the technical requirements to integrate AI, and is already used as a benchmark for Reinforcement Learning. This will enable us to directly compare AIRIS's performance with that of existing algorithms."

SingularityNET CEO insights on AGI and decentralization

On episode 46 of The Agenda podcast, the Cointelegraph team hosted Ben Goertzel, the CEO of SingularityNET and the ASI Alliance, to separate fact from fiction surrounding AGI. Goertzel argued that AI needs to run through decentralized processes "for the good of humanity," adding that this is what "SingularityNET was designed to provide." He said large language models (LLMs) like OpenAI's ChatGPT can perform general tasks but fall short of AGI because they don't venture beyond their training. The SingularityNET and ASI Alliance CEO defined AGI as an AI that can do "everything that people can do, including the human ability to leap beyond what we've been taught."
SingularityNET and ASI Alliance have introduced AIRIS, a proto-AGI system that autonomously learns to navigate and adapt within Minecraft, marking a significant step towards artificial general intelligence.
SingularityNET and the Artificial Superintelligence Alliance (ASI Alliance) have unveiled AIRIS (Autonomous Intelligent Reinforcement Inferred Symbolism), a groundbreaking proto-AGI system designed to autonomously learn and adapt within the popular game Minecraft [1]. This innovative AI represents a significant leap towards the development of artificial general intelligence (AGI), as it can evolve and create its own rules based on real-time experiences without relying on pre-set instructions.
AIRIS operates within Minecraft using two primary types of environmental inputs:

- A 5 x 5 x 5 3D grid of the block names surrounding the agent, which is how it "sees" the world
- The agent's current coordinates in the world
The AI can perform 16 different movement-based actions: moving or jumping in any of eight directions (the four cardinal directions plus the four diagonals) [2]. This setup enables AIRIS to explore and navigate the complex 3D environment of Minecraft, learning to overcome obstacles such as trees, mountains, and caves as it encounters them.
What sets AIRIS apart from traditional reinforcement learning algorithms is its ability to engage in free exploration and navigate through unknown areas. The AI builds an internal map of its surroundings and actively seeks to fill in any empty spaces, demonstrating a level of curiosity and adaptability not typically seen in AI systems [2].
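The coordinate-directed behavior the researchers describe — given a target, the agent plans a route even through terrain it has never seen — can be sketched as a standard A* search over a grid, optimistically treating unobserved cells as traversable and replanning as the map fills in. This is a generic illustration under those assumptions, not SingularityNET's actual planner:

```python
import heapq

def astar(passable, start, goal):
    """A* on a 2D grid with unit step costs. `passable(cell)` may be
    optimistic about cells the agent has not observed yet."""
    def h(c):  # Manhattan-distance heuristic (admissible for unit costs)
        return abs(c[0] - goal[0]) + abs(c[1] - goal[1])
    frontier = [(h(start), 0, start, [start])]
    best = {start: 0}
    while frontier:
        _, g, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return path
        x, y = cell
        for n in [(x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)]:
            if passable(n) and g + 1 < best.get(n, float("inf")):
                best[n] = g + 1
                heapq.heappush(frontier, (g + 1 + h(n), g + 1, n, path + [n]))
    return None  # goal unreachable through currently passable cells

# Toy world: a 5x5 area with a wall at x == 2, gap at y == 0.
def passable(c):
    x, y = c
    return 0 <= x <= 4 and 0 <= y <= 4 and not (x == 2 and y != 0)

path = astar(passable, (0, 2), (4, 2))
```

In a live agent, `passable` would be backed by the internal map; whenever a planned step turns out to be blocked, the map is updated and `astar` is simply rerun.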
AIRIS can operate in two modes:

- Free Roam, in which it explores on its own and works to fill in its internal map
- Directed navigation, in which it is given a set of coordinates and finds its way there, even through areas it has never seen
The researchers behind AIRIS plan to expand its capabilities in future iterations, including the ability to place blocks, collect resources, fight monsters, and engage in crafting activities [1]. This expansion will further test the AI's ability to adapt and learn in increasingly complex scenarios.
The development of AIRIS has significant implications beyond gaming. It could potentially lead to applications in robotics, automation, and smart systems capable of solving real-time problems in unpredictable environments [3]. The AI's ability to refine its own "rule set" based on experiences makes it a promising candidate for tackling real-world challenges that require independent and contextual problem-solving.
Ben Goertzel, CEO of SingularityNET and the ASI Alliance, emphasizes the importance of decentralized processes in AI development "for the good of humanity" [3]. He distinguishes AIRIS and similar proto-AGI systems from large language models like ChatGPT, noting that true AGI should be able to "leap beyond what we've been taught," a capability that AIRIS is striving to achieve.
As AIRIS continues to evolve and learn within the Minecraft environment, it serves as a controlled test for future applications that require adaptive, real-time learning. This project not only pushes the boundaries of AI in gaming but also represents a significant step towards the development of more versatile and intelligent AI systems capable of tackling complex, real-world challenges.
© 2024 TheOutpost.AI All rights reserved