Creator Jam Trailer, Text Rendering Layout System, Metaverse Training Center
Hello everyone!
We have another weekly update for you. We’ve made more significant progress on the new text rendering system - it’s nearing completion now! We’re also devoting more resources to building the Metaverse Training Center, a new starter experience for our users.
The Creator Jam has released their first trailer, showcasing more than 6 months of collaborative community creations.
We’ll be going live on Twitch in a bit, so feel free to join to hang out with us and ask any questions about Neos and its development!

[h1]Text layout and formatting for the new rendering system[/h1]
Most of the development time has been focused on the new text rendering system that will become the foundation of the new UI. We have implemented the major subsystems for getting the text prepared for rendering: parsing of markup, shaping and layout (putting glyphs on lines, word wrapping, alignment, justification…).
This system takes a string of text as input and processes it in several stages, each one designed to be modular and extensible, so we can easily expand on each part of the system in the future to support more features and languages.
The first part takes the text and processes it into an in-memory data structure, automatically parsing the various formatting tags - for example tags for bold text, for changing the color, for changing the size or for changing the alignment.
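To make the idea more concrete, here’s a minimal Python sketch of this stage, assuming a simple made-up tag syntax (the actual parser supports more tags and proper error handling, and isn’t written in Python):
[code]
import re
from dataclasses import dataclass, replace

# Hypothetical tag syntax for illustration only; the real parser handles more
# tags, nesting and error recovery.
TAG = re.compile(r"<(/?)(b|color|size|align)(?:=([^>]+))?>")

@dataclass(frozen=True)
class Style:
    bold: bool = False
    color: str = "default"
    size: float = 1.0
    align: str = "left"

@dataclass
class Run:
    text: str
    style: Style

def parse_markup(source: str) -> list[Run]:
    """Split marked-up text into runs of uniformly styled characters."""
    runs, style, stack, pos = [], Style(), [], 0
    for match in TAG.finditer(source):
        if match.start() > pos:                        # plain text before the tag
            runs.append(Run(source[pos:match.start()], style))
        closing, name, value = match.groups()
        if closing:
            style = stack.pop() if stack else Style()  # restore the previous style
        else:
            stack.append(style)
            if name == "b":
                style = replace(style, bold=True)
            elif name == "color":
                style = replace(style, color=value)
            elif name == "size":
                style = replace(style, size=float(value))
            elif name == "align":
                style = replace(style, align=value)
        pos = match.end()
    if pos < len(source):
        runs.append(Run(source[pos:], style))
    return runs

print(parse_markup("Hello <b>bold <color=red>and red</color></b> world"))
[/code]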

Once the text is parsed, glyphs are generated from the referenced font files and properly shaped (positioned relative to each other). This currently includes simple kerning and in the future will include more complex shaping rules (likely using the HarfBuzz library) for better quality and support of certain languages like Arabic.
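As a rough illustration of what “simple kerning” means in practice, here’s a small sketch - the glyph widths and kerning pairs below are made up for the example, not taken from any real font:
[code]
# Illustrative only: a real shaper reads advances and kerning pairs from the
# font file (or delegates to a library such as HarfBuzz for complex scripts).
ADVANCES = {"A": 0.60, "V": 0.60, "T": 0.58, "o": 0.50}   # made-up glyph widths (em)
KERNING = {("A", "V"): -0.08, ("T", "o"): -0.05}          # made-up kerning pairs (em)

def shape(text: str) -> list[tuple[str, float]]:
    """Return (glyph, x offset) pairs with simple pair kerning applied."""
    placed, pen_x = [], 0.0
    for prev, glyph in zip([None] + list(text), text):
        pen_x += KERNING.get((prev, glyph), 0.0)  # pull kerned pairs closer together
        placed.append((glyph, pen_x))
        pen_x += ADVANCES.get(glyph, 0.55)        # advance the pen by the glyph width
    return placed

print(shape("AV"))   # [('A', 0.0), ('V', 0.52)] - the V tucks under the A
[/code]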
The shaped text is then laid out into lines, automatically wrapping words onto a new line if they exceed the given region size. This ensures that the text will nicely wrap itself into the UI layout or whatever area you define. In the future this will be extended to support more complex shapes, for example having the text wrap around an embedded image or into a shape.
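The word-wrapping step can be pictured as a greedy fill of each line. A simplified sketch (the real layout stage works on shaped glyph runs rather than whole words, and the widths here are invented):
[code]
# A minimal greedy word-wrapper over (word, width) pairs - a simplification of
# the real layout stage, which works on shaped glyph runs rather than words.
def wrap(words: list[tuple[str, float]], region_width: float) -> list[list[str]]:
    lines, current, used = [], [], 0.0
    space = 0.25  # assumed width of a space, in the same units as the word widths
    for word, width in words:
        needed = width if not current else used + space + width
        if current and needed > region_width:
            lines.append(current)            # the word doesn't fit: start a new line
            current, used = [word], width
        else:
            current.append(word)
            used = needed
    if current:
        lines.append(current)
    return lines

words = [("Hello", 2.1), ("shiny", 2.0), ("new", 1.4), ("metaverse", 3.3)]
print(wrap(words, region_width=5.0))   # [['Hello', 'shiny'], ['new', 'metaverse']]
[/code]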




The laid-out glyphs are then post-processed with additional layout rules and modifications - aligning the symbols to the left, center or right, or justifying them - resulting in the final positions of the glyphs.
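A simplified sketch of this alignment pass, operating on the per-line glyph offsets produced by the previous stage (the justification here spreads the leftover space between glyphs for brevity; a proper implementation stretches the spaces between words instead):
[code]
# Sketch of the alignment pass: given a line's glyph x offsets and total width,
# shift (or stretch) the line to match the requested alignment within the region.
def align_line(offsets: list[float], line_width: float,
               region_width: float, mode: str) -> list[float]:
    if mode == "right":
        return [x + (region_width - line_width) for x in offsets]
    if mode == "center":
        return [x + (region_width - line_width) / 2 for x in offsets]
    if mode == "justify" and len(offsets) > 1:
        extra = (region_width - line_width) / (len(offsets) - 1)
        return [x + i * extra for i, x in enumerate(offsets)]
    return list(offsets)   # "left" (or a single glyph): nothing to do

print(align_line([0.0, 0.6, 1.2], line_width=1.8, region_width=3.0, mode="center"))
# [0.6, 1.2, 1.8]
[/code]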
The system was designed to handle the different stages abstractly, on an in-memory data structure, so it can be updated on the fly - for example, changing a font or size will only re-run some parts of the process, saving on performance.
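The on-the-fly update idea could be pictured like this - a hypothetical sketch, not the actual engine code, with trivial stand-ins for the real stages:
[code]
# Hypothetical sketch of the on-the-fly update idea (not the actual engine code):
# each stage caches its output and is invalidated only when one of its inputs
# changes, so e.g. resizing the region re-runs layout but not parsing or shaping.
class Stage:
    def __init__(self, compute):
        self.compute = compute
        self.cached = None

    def invalidate(self):
        self.cached = None

    def get(self, *inputs):
        if self.cached is None:            # recompute only when marked dirty
            self.cached = self.compute(*inputs)
        return self.cached

# Trivial stand-ins for the real parse/shape/layout stages described above.
parse_stage  = Stage(lambda text: text.split())
shape_stage  = Stage(lambda words: [(w, 0.5 * len(w)) for w in words])
layout_stage = Stage(lambda shaped, width: [g for g, _ in shaped])

def layout(text, region_width):
    return layout_stage.get(shape_stage.get(parse_stage.get(text)), region_width)

layout("Hello metaverse", 5.0)
layout_stage.invalidate()    # only the region size changed: just re-run layout
layout("Hello metaverse", 4.0)
[/code]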
The resulting data structure can then be read by different “renderers” - the components that actually generate the final visuals, and the main remaining piece of the puzzle before the new text system is fully functional.
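Conceptually, a renderer only needs to know how to walk the finished glyph structure. Something like this hypothetical interface (the names and data shapes are invented for illustration):
[code]
# Hypothetical renderer interface for illustration: any consumer that can walk
# the laid-out glyphs (a bitmap rasterizer, a 3D mesh generator, ...) can be
# plugged in without touching the parsing, shaping or layout stages.
from typing import Protocol

class GlyphRenderer(Protocol):
    def draw_glyph(self, glyph: str, x: float, y: float, style: object) -> None: ...

def render(lines: list[list[tuple[str, float, object]]],
           renderer: GlyphRenderer, line_height: float = 1.2) -> None:
    for line_index, line in enumerate(lines):
        y = line_index * line_height
        for glyph, x, style in line:
            renderer.draw_glyph(glyph, x, y, style)
[/code]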
For testing purposes we have used a very primitive renderer that renders the glyphs into a Neos bitmap. It doesn’t provide high quality, but it’s sufficient to make sure the rest of the system works as intended. Here are some examples of it in action:


This demonstrates the font fallback system: one main font with two fallbacks (one for the Czech letters and one for Japanese). The system will automatically search the fallback fonts when a glyph is missing from the main one.
This demonstrates the ability to select different fonts within the text using markup. The system automatically combines the different glyph metrics on a single line.
Using different alignment methods (left, right, center and justified) for different parts of the text.
If you’re interested to learn more technical details about this new system, check out the #devlog channel on our Discord!
[h1]Metaverse Training Center - building the Avatar and Art rooms[/h1]
We’re also making progress on another of our major upcoming features, the “Metaverse Training Center”. This will serve both as a new introductory experience for our users and as a learning resource, bringing together all the tools and information needed to get users started with the different features that Neos offers.
Currently we’re finishing two main rooms - the avatar room and the art room. In the avatar room, users will be able to learn everything about setting up an avatar in Neos. They’ll be able to customize their own version of the Neos Bot with a simple UI, add accessories or easily create their own.

We have made a few other gadgets to help along the way - a switchable mirror/portal and an adjustable height meter. In the other rooms you can learn how to import your own avatar models, set up dynamic bone chains and more.
The art room, on the other hand, has a collection of different brushes and materials, as well as examples of objects built with them. We want users to explore this room and play around, learning how to make their own art inside of Neos.

There are some other rooms that will come as part of the experience, particularly a streamer/camera room for learning about our interactive camera system and Twitch integration, and a world-building room focusing on the developer tooltip, LogiX and asset importing.
After a few extra additions, we’ll open up the first basic version of the Metaverse Training Center to everyone and keep on improving and expanding the experience. What other rooms would you like to see?
[h1]Community Highlights[/h1]
We’ve seen even more people stream Neos this week! We’re very grateful to you all for helping spread the word about Neos and showcasing what it can do. The Creator Jam has also released their first trailer!
[h2]Creator Jam Trailer[/h2]
Medra has been hosting weekly creator jams for over 6 months now, and over that time the community has come together to build some really amazing things! He has put together a trailer showcasing some of the cool creations from the past half year - check it out here!
The creator jams are held every Sunday and everyone is free to participate. Join our Discord and check out the #meetups channel - every week has a different building theme.
[previewyoutube][/previewyoutube]
[h2]Just Dancing and Rukio live streams[/h2]
Rukio has been working with our team member Coffee on a brand new world called “Just Dancing”. It is designed for everyone who likes to dance in VR and stream, with beautiful music-controlled particle effects and Twitch integration so you get cheers whenever someone follows or subscribes. Check out his Twitch channel here: https://www.twitch.tv/ruikio

[h2]Visit the Garden Temple[/h2]
GearBell, one of our most prolific creators, has published a new world called “Garden Temple”. It’s a beautiful blend of nature and architecture, with sound effects and butterflies!

[h2]Scream to power up Dragon Ball avatar[/h2]
Danyy59 has been making fun and quirky characters in Neos for a while. One of his latest creations is a Dragon Ball Z character, which powers up as you scream! The audio is a little quiet in the video, but you can see it in action here:

[h2]Bobotron's Must Play Monday[/h2]
Neos was also recently featured on Bobotron's YouTube channel as part of his Must Play Monday segment. He goes over his first Neos experience. Thank you Bobotron!
https://www.youtube.com/watch?v=71PzT5D3p1U
[h1]What’s next?[/h1]
With the major subsystems of the new text rendering system mostly done, the work is now wrapping up (pun intended). What’s remaining is building a new renderer that takes all the data and puts everything together to actually display the text in Neos. You’ll be able to play with the new system very soon!
Once that’s done, the work will begin on writing a new UI framework from scratch with the new font rendering at its core, which will pave the way to overhauling the actual UI and UX for a much improved user experience.
There are some other cool things we have planned for the system. Since there can be many different “consumers” of the processed text structure, we’re going to be adding more renderers. For example you’ll be able to directly generate a 3D mesh out of the text for cool effects and signs!
And lastly, I’d like to thank you all for being patient with the development of this new system. It’s one of the most technically complex and challenging parts of Neos and was long overdue for a proper replacement. I want to make sure it’s a solid foundation for many years ahead - one that will let us build the UI and UX that we all really want.