Cernisoft Gaming

Sony patents a method to create music in video games based on the player's emotions

Artificial intelligence (AI) is advancing at a remarkable pace and in recent years has proven useful in many areas. The technology's future is attractive to the video game industry because of the impact it could have on the development process. Although it operates on numerical data and analysis, AI could also play an important role in the artistic side of games, opening up nearly unlimited possibilities. Sony knows this well, and proof of it is a recent patent describing a model capable of linking users' emotions with music and representing them in a video game.

The patent in question was filed a few months ago, but it has only now been published by the World Intellectual Property Organization. In it, Sony describes "a method and system for dynamic music creation". According to the document, the system would be able to identify an emotion, assign it to one or more musical elements, and link it to a "game vector".
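As a rough illustration of the relationships the patent describes, the sketch below models an emotion linked to a set of musical elements and a game vector. All of the names and fields here are hypothetical assumptions; the patent does not specify any data format.

```python
from __future__ import annotations
from dataclasses import dataclass, field

# Hypothetical data model: none of these names come from the patent itself.
@dataclass
class MusicalElement:
    name: str   # e.g. "tempo", "modality", "timbre"
    value: str  # e.g. "fast", "minor", "bright"

@dataclass
class GameVector:
    # The patent ties the "game vector" to characters, scenes,
    # activities, or the player's personality/behavior.
    kind: str        # e.g. "character", "scene", "activity"
    identifier: str

@dataclass
class EmotionMapping:
    emotion: str  # e.g. "tension"
    elements: list[MusicalElement] = field(default_factory=list)
    game_vector: GameVector | None = None

# Example: tie the emotion "tension" to musical traits and a boss encounter.
mapping = EmotionMapping(
    emotion="tension",
    elements=[MusicalElement("tempo", "fast"), MusicalElement("modality", "minor")],
    game_vector=GameVector(kind="character", identifier="boss_encounter"),
)
```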

The system would map emotions using an artificial intelligence model. First, musical information would be collected and musical and emotional metadata generated from it, with the two linked through mapping. Game information, such as scenes and other components, would also be collected and mapped to emotional metadata. With this information, music would then be composed for characters and scenes within the game. Finally, this information could optionally be linked to the user's personality or behavior and refined over successive iterations as the player interacts with the game.
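The following sketch outlines how such a pipeline might be wired together. It is purely illustrative: the function names, the emotion labels, and the mapping logic are assumptions, not details taken from the patent.

```python
# Illustrative pipeline sketch; every function here is a hypothetical stand-in
# for a stage the patent describes only at a high level.

def collect_music_metadata(track):
    """Extract musical metadata (tempo, modality, timbre, ...) from a track."""
    return {"tempo": 128, "modality": "minor", "timbre": "dark"}  # placeholder values

def infer_emotional_metadata(music_metadata):
    """Label musical metadata with an emotion (stand-in for the AI model)."""
    return {"emotion": "tension"} if music_metadata["modality"] == "minor" else {"emotion": "calm"}

def collect_game_metadata(scene):
    """Gather scene/component information from the running game."""
    return {"scene": scene, "enemies_nearby": True}

def map_to_game_vector(game_metadata, emotional_metadata):
    """Link game components to the emotional metadata (the 'game vector')."""
    return {**game_metadata, **emotional_metadata}

def compose(game_vector):
    """Produce a musical arrangement for the mapped scene (stubbed out)."""
    return f"composition for {game_vector['scene']} expressing {game_vector['emotion']}"

# One iteration of the loop; the patent suggests this could repeat as the player interacts.
music_meta = collect_music_metadata("battle_theme.wav")
emotion_meta = infer_emotional_metadata(music_meta)
game_meta = collect_game_metadata("boss_arena")
game_vector = map_to_game_vector(game_meta, emotion_meta)
print(compose(game_vector))
```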

Another part of the document explains that the mechanism would analyze music and break it down into components such as rhythm, tempo, melodic structure, modality and timbre. These elements would be mapped to emotional components and represented in the game through characters, enemies, or even activities or the user's personality (the game vector). The resulting music or melody would be projected onto the game vector according to the emotion, ultimately generating a musical composition based on the game vector and the user's emotions.
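To make the "projection" idea more concrete, the sketch below adjusts a handful of musical parameters based on a detected emotion and a game vector. The feature set mirrors the components the patent lists, but the adjustment rules are invented for illustration only.

```python
# Hypothetical projection of musical features onto a game vector by emotion.
# Feature names mirror the components mentioned in the patent (rhythm, tempo,
# modality, timbre); the rules themselves are assumptions.

BASE_FEATURES = {"tempo": 100, "modality": "major", "timbre": "warm", "rhythm": "steady"}

EMOTION_RULES = {
    "tension": {"tempo": +40, "modality": "minor", "timbre": "harsh", "rhythm": "driving"},
    "calm":    {"tempo": -20, "modality": "major", "timbre": "soft",  "rhythm": "sparse"},
}

def project(emotion, game_vector, base=BASE_FEATURES):
    """Return musical features adjusted for the emotion tied to a game vector."""
    rules = EMOTION_RULES.get(emotion, {})
    features = dict(base)
    for key, value in rules.items():
        if isinstance(value, int):      # relative adjustment (e.g. tempo offset)
            features[key] = features[key] + value
        else:                           # direct substitution (e.g. modality)
            features[key] = value
    features["target"] = game_vector    # the character/scene/activity being scored
    return features

print(project("tension", game_vector="boss_encounter"))
# -> {'tempo': 140, 'modality': 'minor', 'timbre': 'harsh', 'rhythm': 'driving', 'target': 'boss_encounter'}
```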

Documents of this kind often use technical and somewhat ambiguous language, so it can be hard to see how this would work in practice. It is also worth noting that the patent does not include images of how the system might look in use, only flow diagrams.

Keep in mind that this is only a patent, so it may never materialize. Still, the idea of being able to interact with a video game in that way is intriguing.