Varjo Technologies is an extended reality (XR) company specialising in virtual and mixed reality (VR/MR) headsets, and it has been pioneering the highest-quality technologies in the sector since 2016.
Following a series of product launches in October last year, including its Aero headset and Teleport solution, Varjo Technologies has stepped up efforts to create a platform for future Metaverse technologies.
XR Today spoke with Urho Konttori, Founder and Chief Technology Officer for Varjo Technologies, the world’s premier enterprise headset manufacturer based in Helsinki, Finland, about his company’s Reality Cloud global streaming update unveiled in mid-January.
XR Today: Can you tell us some general information about Varjo’s Reality Cloud solution, and how it compares to current Metaverse solutions in the market?
Urho Konttori: We announced last summer that Varjo is working on something called Varjo Reality Cloud, which will transform the way computing is done in the future, especially how VR and MR headsets operate, by bringing the infinite computing power of the cloud to any headset.
At this moment, roughly half a year after our announcement, we are showcasing that our alpha customers have been trialling Varjo Reality Cloud and its fantastic computing and streaming capabilities over the last couple of months.
One of our users, Rivian, is a world-renowned electric carmaker who has gone public on how amazingly transformative this service has already been for them.
When you participate in any kind of [trial], whether as a consumer or a company, it typically comes with some surprises and glitches, which is why we run the full service ourselves, so we can actually fix them.
Participating as an alpha customer is a great way to push for new requests [and bug fixes], which we prefer. We’re very willing to listen to our clients, especially at that stage, and make modifications.
So, while you lose some time and patience as an Early Access Partner, you also gain a lot, not only in the way services are customised, but also by moving faster than others.
The Varjo Reality Cloud is now capable of running any PC VR application in the cloud with fully customisable services in our server infrastructure.
You can now say, “I want more graphical processing units (GPUs), 100 central processing units (CPUs), or to have a terabyte of random access memory (RAM) immediately at my disposal for this one session,” which is all very easy to set up.
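To make that per-session provisioning concrete, here is a minimal, purely illustrative sketch of what such a request could look like in code. The `SessionRequest` class and `provision_session` function are hypothetical names invented for this example; they are not part of any published Varjo Reality Cloud API.

```python
from dataclasses import dataclass

@dataclass
class SessionRequest:
    """Hypothetical description of the resources wanted for one cloud session."""
    gpus: int     # number of GPUs to attach
    cpus: int     # number of CPU cores
    ram_gb: int   # RAM in gigabytes
    app: str      # the PC VR application to launch in the cloud

def provision_session(request: SessionRequest) -> str:
    """Pretend provisioner: validates the request and returns a session label.

    A real service would reserve hardware here; this only illustrates the shape
    of a 'more GPUs, 100 CPUs, a terabyte of RAM for this one session' request.
    """
    if request.gpus < 1 or request.cpus < 1 or request.ram_gb < 1:
        raise ValueError("a session needs at least one GPU, one CPU core and some RAM")
    return (f"session[{request.app}]: {request.gpus} GPU(s), "
            f"{request.cpus} CPU core(s), {request.ram_gb} GB RAM")

# Example: the '100 CPUs and a terabyte of RAM for one session' scenario.
print(provision_session(SessionRequest(gpus=4, cpus=100, ram_gb=1024, app="Autodesk VRED")))
```

The point is simply that the hardware is described per session rather than bought and maintained as physical machines.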
What we have been doing with some alpha partners is ‘over-optimising’ the experience for particular use cases. For example, with Rivian, we have been working together with Autodesk, one of our longest partners to date, so that when you, as a designer working on a model, want feedback from peers or managers, you can send them a link to the Reality Cloud.
You can add notes such as, “Please look at this model. I’ve set everything up in VRED and will be there myself at 4pm,” and when they click the link, the connection is made from your computer to the cloud, where the model is waiting, fully prepared with no hassle.
We can simplify licence management of even the highest-end software like Autodesk VRED, so the user really doesn’t need to worry about installing it or managing licences just to review a design.
Autodesk VRED is used by nearly all automotive companies in their design processes, and a single seat costs over $20,000, or around $6,000 a year for a subscription, so we’re referring to the highest-end software possible.
Furthermore, when the manager clicks on the link, the cloud sets up the connection to render the car and Collaborative Mode instantly. You can then see your colleague, the designer who asked for the feedback, and start collaborating, discussing, annotating, and making notes, removing the friction from the whole process and using the Cloud whenever it’s needed.
You don’t have to wait for managers to get a VR computer, which could easily take months in today’s world of limited access to silicon and high-end GPUs; one can easily do this with our solution. You have a much wider variety of components you can connect to a Varjo headset to get that human-eye resolution [at only 30 megabits per second] with perfect image fidelity.
This [allows our solution] to run on real computer networks and inside corporate networks that don’t always provide infinite bandwidth to everybody demanding it.
So, I suppose, [there are multiple benefits such as] the scalability of the cloud, better-than-local image fidelity thanks to its scalable infrastructure, and a turnkey solution that really removes the obstacles to using high-end applications.
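As a rough sanity check on that human-eye-resolution stream of roughly 30 megabits per second, the sketch below estimates how much compression such a stream implies. The per-eye resolution, refresh rate, and colour depth are illustrative assumptions, not specifications quoted in the interview.

```python
# Rough arithmetic: raw dual-display video bandwidth versus a compressed
# cloud stream of roughly 30 Mbit/s. All hardware numbers are illustrative
# assumptions, not specifications quoted in the interview.

width, height = 2880, 2720   # assumed pixels per eye
eyes = 2
fps = 90                     # assumed refresh rate
bits_per_pixel = 24          # uncompressed 8-bit RGB

raw_bps = width * height * eyes * fps * bits_per_pixel
stream_bps = 30e6            # the ~30 Mbit/s order of magnitude discussed above

print(f"raw video:     {raw_bps / 1e9:.1f} Gbit/s")
print(f"cloud stream:  {stream_bps / 1e6:.0f} Mbit/s")
print(f"implied compression ratio: ~{raw_bps / stream_bps:,.0f}:1")
```

On these assumptions, the claim amounts to compressing the raw video by roughly a thousandfold, which is why the stream can fit within ordinary corporate and household connections.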
XR Today: What about the more technical specifications behind the Reality Cloud solution? Which tools have been included in it, and how could it become a stepping stone to greater Metaverse innovations in the future?
Urho Konttori: Overall, I think cloud computing is an interesting part of the Metaverse that has so far been slightly overlooked in the consumer domain. For example, consider NVIDIA’s Omniverse, which is a distributed GPU computing platform where you’re able to run full-world simulations.
Through its application programme interfaces (APIs), you’re able to do real-time ray tracing for design workflows in a highly scalable Metaverse, which is why it’s called an Omniverse.
It’s something that is also possible in the consumer domain. The key difference when you run something like a full Metaverse instance in the cloud, with the rendering alongside it, is that for every single user receiving that rendered stream of the Metaverse, the machine doing the rendering has nearly infinite data bandwidth to the other nodes in the network.
So typically, multiplayer games are limited to roughly 20 users at a time. In some cases, like Fortnite, you can have 100 people on the same island at the same time, but not in the same scene, because whenever you have any kind of interaction with other people, their places in the world and their movements must be synchronised with all 100 players simultaneously.
It creates a latency and network bandwidth issue when you try to scale beyond a few tens of people, but this is not the case when all the rendering machines are in the cloud. They’re not limited to 100 megabits per second like our household connections, especially on the upload stream. They’re running at gigabit speeds, tens of gigabits easily, allowing you to create a much more scalable, interactive, and real-time Metaverse [with] a cloud-hosted renderer for consumer use cases.
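A quick, illustrative calculation shows why that synchronisation cost climbs with player count. The update size and tick rate below are assumed values chosen only to show the scaling trend, not measurements from any particular game or from Varjo.

```python
# Illustrative scaling of multiplayer state synchronisation. Each client must
# receive position/movement updates from every other player; the update size
# and tick rate are assumed values, chosen only to show the trend.

STATE_BYTES = 64   # assumed size of one player's state update
TICK_RATE = 30     # assumed updates per second

def per_client_sync_mbps(players: int) -> float:
    """Bandwidth one client needs just to stay in sync with the other players."""
    return (players - 1) * STATE_BYTES * TICK_RATE * 8 / 1e6

for players in (20, 100, 1_000, 10_000):
    print(f"{players:>6} players -> {per_client_sync_mbps(players):7.2f} Mbit/s per client")
```

On a 100-megabit household connection that traffic competes with everything else on the link, whereas a cloud-hosted renderer sees it over datacenter links running at tens of gigabits per second.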
I think it’s one of those interesting aspects to see in the future how the Metaverse will grow overall. We have some gaming companies looking into cloud-predominant infrastructure models; Mainframe Industries comes to mind instantly, but there are many others as well.
I could certainly see that, at some point, the Metaverse could actually run similarly to the way computing does on the cloud, rather than on client devices themselves, but who knows? It’s a new domain and anything can happen, but it certainly keeps my imagination going at light speed.
XR Today: Looking at your solutions such as the XR-3, VR-3 and Aero headsets, could you tell us about some interesting use cases, past or potential, where Reality Cloud would really benefit people?
Urho Konttori: The Reality Cloud is a rendering solution and one of the key tools enabling scalability in the industry, which is limited by the fact that anybody who needs to use immersive technologies in the high-end domain has to purchase high-end computers, keep them updated, and so forth.
Cloud rendering greatly lowers the computer requirements needed to drive headsets. You can have an ordinary laptop connected to the headset, and then have the cloud do the actual heavy lifting.
It also means that you deploy data only once, to the cloud, and every single user of that data gets instant access to the latest models, data sets, or applications, centrally managed in one place.
We think it’s one of those key enablers for companies to scale the use of immersive technologies, because it removes barriers that typically have hindered their expansion.
We think that’s the domain where we’re having the biggest impact, but for an ordinary person, the biggest values are similar to those you would get from services like GeForce Now.
You don’t need to purchase the latest PC, but you will always have the latest PC, and whenever NVIDIA launches a new generation of GPUs, the cloud gets updated with it.
You are also always running on the latest [equipment], with no additional expense or worry about selling your old computer to somebody else, and you have access to fully scalable architectures.
That said, we do not offer the Varjo Reality Cloud to consumers and are focusing on professional workflows and companies for now, which is why we don’t yet talk about consumer cases.
XR Today: What do you and your company believe the Metaverse will look like over the coming years, and how do you think firms can work together to realise its potential?
Urho Konttori: It’s a pretty deep question, for sure, but I do see that we should think of it, for its users, like the internet, [which is] a singular entity that enables deep connectivity between all services.
At the same time, you still have “secret walled gardens,” so that you cannot access the latest classified data from the US military or car companies. While all of those things are accessible and linkable, we still need to place security there. It’s a similar situation to the internet overall today.
You have multiple, different types of client applications to access data, hosted on the internet, and it’s going to be the same with the Metaverse as well. The important point is to get a similar standardisation of access to the data as we have with web browsers today.
We should always be able to access data using a web browser, but there will also be a need for something more specific, tailored, and optimised to utilise that same data.
When you consider cloud rendering, it’s interesting that now, the client itself doesn’t need to concern itself about which application it’s running. You can have a thin client, like a web browser, to access any kind of content or application running in the actual cloud infrastructure, keeping everything fully connected.
I think that’s the overall architecture I see taking shape, but of course, you will have certain bubbles in there, for example, what Facebook is doing with its services to keep people in that “walled garden,” even though the internet and the web are part of it.
It links to other sites as well, because most people post on it and [Instagram] too. So I do think we’re going to see a lot of similarities to actual web use in the future, and more of our lives will move into the fully digital world of the Metaverse as opposed to existing only in the real world.
XR Today: What else would you like to say about Varjo’s current operations?
Urho Konttori: Varjo is now five years old, and we’ve always focused purely on the business-to-business (B2B) sector to change the way people train and design the future. We see ourselves as enabling companies to really transform their age-old techniques of training and designing into a new generation.
Three years ago, we launched our first human-resolution product, and it started this transformation. Looking back at that stage, we were basically kicking the design sector into motion really fast by shifting from real-world mockups to design reviews done entirely in the virtual domain.
COVID accelerated this tremendously in larger corporations. Our biggest customer at the moment is a consumer electronics giant, and over there, every single designer uses our headsets, because it allows them to move faster and work effectively in a completely hybrid manner.
Simultaneously, we’ve been transforming the way aviation conducts training, replacing the need for full-dome simulators that cost millions of dollars each and are really difficult to access because of that cost.
There are not many available for any kind of organisation, at any given time, but transferring pilot training to the virtual domain is obviously less expensive and means you can spend more time training.
The interesting thing that started happening two years back was that trends gradually shifted from training pilots how to fly to pilots learning how to fly their next mission in [air defence]. Now it has shifted to, “let’s not even go and fly a plane, let’s only do it virtually.” This suddenly gives you much better access, with 20 people participating in a particular mission, or even hundreds if needed.
For example, if you were to fly an F-35 fighter jet, it costs $15,000 per hour to fly it, so that’s a couple of months of pay for a lot of people just for one hour of flight, and that hour pollutes more than a civilian aeroplane with over 300 people onboard.
So again, that’s the level of transformation we see, especially in the defence industry, which is starting to completely replace the old ways of doing things in the physical world and saying, “Hey, we can actually do much better in the virtual domain.” That’s the transformative part, and certainly something we’re seeing happen quickly in the defence space.
There’s also the civilian aviation side, and the European Aviation Safety Agency (EASA) qualified various headsets for receiving full [training] credits as if you were actually flying a plane.
If you wear a Varjo headset and use VRM Switzerland’s motion platform simulation system, the simulation is so realistic that you get the same credits as if you’re actually flying.
It’s fantastic to see the regulatory side [of aviation] moving really fast, and in all honesty, I expected that to happen in a couple of years, not this year or last year, so I’m super happy with that move.
We’re now having similar conversations on the civilian aviation side about whether we could actually allow every single commercial pilot in the world to have a VR headset and do more regular training, rather than the once-a-year training typically seen with most commercial pilots.
Pilots enter a simulator for one day of training and are typically flooded with dangerous conditions such as an engine fire, thunderstorms, and others.
So how could we help pilots stay on their toes with more frequent, regular training? I think we’re seeing technology enable things to be done differently, which to me is very exciting and motivating.
Of course, the Varjo Aero we launched in October last year is our first consumer offering, so consumers can now have access to the same fidelity as professional pilots.
By purchasing the Aero, consumers get access to the best technology in the world, and it works with any PC VR experiences or applications you might have; they’re better with the headset and there are no compatibility issues.