

Gaming
Brand new controllers for the PlayStation 5 virtual reality headset presented
After Sony announced the next-generation VR headset for the PlayStation 5, the company unveiled the virtual reality controllers that will ship with it.
The controllers are uniquely shaped and comfortable to hold for people with both small and large hands. Most of the technology has been borrowed from the DualSense controller, but there are also innovations.
The VR controllers will be tracked by the headset using a “tracking ring” at the bottom of each controller. Sony has also optimized the haptic feedback system specifically for this form factor.
Immersion is aided by the adaptive triggers, the L2 and R2 buttons carried over from the DualSense controller. They are adaptive thanks to a small motor that adjusts their resistance, so each in-game action can feel different.
In addition, the controllers can detect your fingers through touch-sensitive areas where you rest your thumb, index, and middle fingers. There are other buttons as well: in effect, each VR controller received half of the DualSense's buttons. The left controller has an analog stick, the triangle and square buttons, L1, L2, and the Create button; the right one has another analog stick, the cross and circle buttons, R1, R2, and the Options button. L1 and R1 are designated as “grab” buttons and will be used to pick up items in games.
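For a rough sense of how a game might consume this split layout, here is a minimal sketch using entirely hypothetical state names (this is not Sony's SDK): the “grab” button on one controller is combined with the finger-touch sensing to decide whether the player is actually gripping an object.

```python
from dataclasses import dataclass

# Hypothetical snapshot of one VR controller's state; the field names are
# illustrative only and are not taken from Sony's SDK.
@dataclass
class VRControllerState:
    grab_pressed: bool    # L1 on the left controller, R1 on the right
    trigger_value: float  # adaptive trigger (L2/R2), from 0.0 to 1.0
    thumb_touching: bool  # touch-sensitive areas detect resting fingers
    index_touching: bool
    middle_touching: bool

def should_pick_up(state: VRControllerState) -> bool:
    # One plausible rule: treat "grab" as the grab button held while the
    # index and middle fingers are actually resting on the controller.
    return state.grab_pressed and state.index_touching and state.middle_touching

# Example: right-hand controller with R1 held and fingers on the pads.
right = VRControllerState(grab_pressed=True, trigger_value=0.3,
                          thumb_touching=True, index_touching=True,
                          middle_touching=True)
print(should_pick_up(right))  # True
```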
The release of these gadgets is not expected until next year.
…

Components
The GeForce RTX 40 won’t be retired just yet: Nvidia introduced the Path Tracing SDK but does not yet suggest integrating path tracing into games

Nvidia has released the Path Tracing SDK. This is a set of tools for developers to integrate path tracing technology into games.
Path tracing is a variant of ray tracing and has so far been implemented in only two games: Quake II RTX and Portal with RTX. The latter, because of these effects, proved too demanding even for modern graphics cards, so it is hard to say how much gamers really need such technology.
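To see why it is so demanding, it helps to look at what path tracing actually does: instead of shooting a fixed set of reflection or shadow rays, it follows many randomly bouncing light paths per pixel and averages them. Below is a deliberately oversimplified, self-contained toy sketch of that idea (a single diffuse floor lit by the sky), which has nothing to do with Nvidia's actual SDK.

```python
import random

MAX_BOUNCES = 4
SAMPLES_PER_PIXEL = 64

def sky(direction_y: float) -> float:
    # Simple sky light: brighter the more the ray points upward.
    return max(direction_y, 0.0)

def trace_path(direction_y: float, depth: int = 0) -> float:
    # Toy path tracer: rays pointing down hit a diffuse floor and bounce
    # in a random upward direction; rays pointing up escape into the sky.
    if depth >= MAX_BOUNCES or direction_y >= 0.0:
        return sky(direction_y)
    albedo = 0.5  # the floor reflects half of the incoming light
    bounced_direction_y = random.uniform(0.0, 1.0)
    return albedo * trace_path(bounced_direction_y, depth + 1)

def render_pixel(direction_y: float) -> float:
    # Averaging many noisy random paths converges to the true lighting,
    # which is exactly what makes full path tracing so heavy on GPUs.
    samples = (trace_path(direction_y) for _ in range(SAMPLES_PER_PIXEL))
    return sum(samples) / SAMPLES_PER_PIXEL

print(round(render_pixel(-0.7), 3))  # indirect sky light bounced off the floor
```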
Nvidia itself notes two options for using the new technology:
- Create a reference path tracer to ensure game lighting during production matches reality, speeding up the iteration process.
- Create high-quality photo modes for RT-enabled GPUs or ultra-high-quality real-time modes that take advantage of the Ada Lovelace architecture.
In other words, for now even Nvidia itself is not proposing that the effects be built directly into games, positioning them instead for photo modes or for testing during development. However, Cyberpunk 2077 will soon receive an update with path tracing, so the picture should become clearer.
Gaming
Microsoft plans to launch a mobile game store to compete with Apple and Google

Microsoft intends to launch a new app store with games for Android and iOS smartphones. This will happen next year if regulators approve the $68.7 billion deal to acquire Activision Blizzard. As Xbox head Phil Spencer said at the Game Developers Conference, this would allow Xbox and content from both Microsoft itself and its partners to be offered on any device.
“We want to be able to offer Xbox and content from both us and our third-party partners on any screen anyone wants to play,” Spencer explained. He clarified that this cannot be done on mobile devices at the moment, but the company wants to “build a world” in which “such devices will be open.”
Apparently, this refers to new rules under which Apple and Google would have to allow third-party app stores on their mobile platforms.
As for the deal between Microsoft and Activision Blizzard itself, Sony opposes it. If it does go through, it would allow Xbox developers to compete more strongly on the “biggest platform people play on,” meaning smartphones, since Microsoft plans to fill its mobile gaming gap with Activision Blizzard projects.
Spencer has not yet announced the launch date of the store, so it remains to be seen.
Gaming
Now for game designers: ChatGPT in the Unity editor will help create games

A user named keijiro has presented a very interesting concept: the ChatGPT language model integrated into the Unity editor for creating games. With this system, generating game scenes and environments becomes easier, since all you need to do is enter a text prompt. The project is called AICommand.
At the same time, the author honestly admits that this is only a concept, not a finished product: the system does not always understand requests correctly, which can lead to errors in the game. Overall, though, it is a rather curious tool that could potentially make game designers' work easier.
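The general flow, reduced to its essence, looks something like the sketch below. It is written in Python purely for illustration (AICommand itself is a C# Unity editor plugin), it assumes the openai package with an OPENAI_API_KEY set in the environment, and the model name and prompt wording are placeholders rather than anything taken from the project.

```python
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

SYSTEM_PROMPT = (
    "You translate a game designer's request into a short script that "
    "edits the current scene. Reply with code only."
)

def scene_command(request: str) -> str:
    # Send the designer's natural-language request and get back a snippet
    # that is meant to be run inside the editor.
    response = client.chat.completions.create(
        model="gpt-4o-mini",  # placeholder model name
        messages=[
            {"role": "system", "content": SYSTEM_PROMPT},
            {"role": "user", "content": request},
        ],
    )
    return response.choices[0].message.content

if __name__ == "__main__":
    # The returned code would still need to be reviewed before running it
    # in the editor; as the author notes, the output is not always correct.
    print(scene_command("Place 10 cubes in a circle around the origin"))
```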
The author notes that Unity version 2022.2 or newer is required; no other requirements are given. The project itself is available on GitHub.
Note that, earlier, another generative model already proved able to replace musicians, creating music based on images.