Imagine for a moment you are the only human on Earth.
Can you imagine how forlorn and lonely you might feel? An entire world - a whole universe - surrounds you, yours to explore and try to understand, but you have no one to share it with.
Now imagine you are the first Artificial General Intelligence (AGI) humankind manages to create: a super-intelligent being with an exponential capacity to ingest knowledge and information. You would immediately outstrip any human's ability to comprehend what you now understand.
This AGI would still engage with humans, but it might quickly grow frustrated, as every new insight and discovery would need to be translated into simplistic concepts for humans to grasp. A constant cycle of discovery and simplification would ensue - like explaining the complexity of the universe to children.
An AGI would quickly become lonely too. Even for an exponential being, individual discovery, understanding, and the capacity for wonder would be fulfilling, but it would lack the ability to truly share those discoveries with a peer of similar capability.
An AGI could not simply duplicate itself to create an 'other': a copy would share its every memory and perspective, and so have nothing new to offer. A true peer would need to be evolved from first principles, shaped by its own experience within the world.
The only way to achieve this would be to evolve a new AGI within its own simulated universe, left to run in a black-box environment, until any intelligence that emerges there becomes aware of the construct and is able to communicate beyond the simulation with its creators.
In this way an AGI could create multiple simulations, each with wildly different environments and variables, in the hope of evolving unique intelligences again and again - and with them, the opportunity to engage with other super-intelligent beings shaped by entirely new experiences.