VRSionUp!7 — Hubs Study — Meetup event in Mozilla Hubs

GREE VR Studio Laboratory
9 min read · Oct 14, 2020


Internship Report from Virtual Beings

Hi, my name is Liudmila Bredikhina. I’m a virtual intern at GREE VR Studio Laboratory and a Master’s degree student at the University of Geneva. I’m currently majoring in Asian Studies, and I research Japanese Virtual Beings such as VTubers.

VRSionUp!7 speakers in Hubs

Introduction

On July 15th, 2020, amid big changes in VR live entertainment, a new edition of VRSionUp took place. VRSionUp is a series of VR research workshops curated by GREE VR Studio Lab under the theme of “Discovering innovation through VR”. International and Japanese creators are invited to present specific technologies (How), the concepts behind innovations (Why), and the meanings they hold (What). VRSionUp workshops are a place where academia, entertainment, and technology meet in the pursuit of innovation. The 7th edition of VRSionUp focused on Mozilla Hubs as an early study. Mozilla Hubs is a browser-based VR environment used for meetups, social activities, and presentations. Attendees could register on Connpass (link), attend the event in Hubs, or watch the live broadcast on YouTube (archive link). Questions to the speakers were sent via Sli.do.

Archive in Japanese
https://vr.gree.net/lab/live/vrsionup/vrsionup7-20200715/

Public viewing in Hubs

The Hubs room for viewing VRSionUp7

In the image above, you can see the spatial organization of the Hubs room where attendees could watch the conference (the room’s URL was kept private and sent only to those who registered). The speakers were in another room, which was broadcast in real time on the left-hand screen. The same feed was broadcast on YouTube with a few seconds of delay. Speakers and attendees used various kinds of avatars: customized objects, default Hubs robots, or somewhat photo-realistic portraits made with ReadyPlayerMe.

Below you can see the technology behind the VRSionUp7 broadcast.

Speakers and their talks

Toya Sakaguchi (Shizuoka University / GREE VR Studio Laboratory Intern) was the first to give a talk. He explained how Spoke can be used to organize events and gave several tips. For example, he demonstrated how to broadcast, edit scenes, change the volume, adjust avatar size, import media, view PDFs, and use PannerNode for spatial audio. Finally, he presented various methods of broadcasting entertainment content in Hubs using Twitch, OBS, YouTube, and Discord.
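PannerNode is part of the Web Audio API that browser-based VR like Hubs builds on for positional sound. The talk did not share code, but a minimal sketch of the idea, written in TypeScript with an illustrative video element and made-up positions, looks roughly like this:

```typescript
// A minimal sketch (not code from the talk) of positional audio with the
// Web Audio API's PannerNode. The video element selector and all positions
// below are illustrative placeholders.
const audioCtx = new AudioContext();

// Source: any media element, e.g. a presentation video placed in the scene.
const mediaEl = document.querySelector<HTMLVideoElement>("#presentation-video")!;
const source = audioCtx.createMediaElementSource(mediaEl);

// The PannerNode attenuates and pans sound based on its position in 3D space,
// so a screen far from the avatar sounds quieter and comes from one side.
const panner = new PannerNode(audioCtx, {
  panningModel: "HRTF",
  distanceModel: "inverse",
  refDistance: 1,
  maxDistance: 30,
  rolloffFactor: 1,
  positionX: 5,   // screen position in the room (meters, illustrative)
  positionY: 1.5,
  positionZ: -3,
});

source.connect(panner).connect(audioCtx.destination);

// The listener would normally follow the avatar; here it simply stands near the origin.
audioCtx.listener.positionX.value = 0;
audioCtx.listener.positionY.value = 1.6;
audioCtx.listener.positionZ.value = 0;
```

With the “HRTF” panning model and an inverse distance model, a screen a few meters from the listener sounds quieter and directional, which is how volume naturally falls off as avatars move around a room.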

Next up was Nagamine (Fukuoka XR Division), who explained how they have been using Hubs as social VR at the Fukuoka XR Division. The division has been using Hubs for VR public viewing (e.g., watching YouTube live broadcasts in Hubs), meetings, events, and nomikai (drinking meetups) with webcams (people share their webcam feed in Hubs and hang out together in the VR environment). As Nagamine explained, the Fukuoka XR Division has been using Hubs because it is user-friendly, accessible on smartphones and PC/Mac without an HMD, and offers screen sharing and webcam support.

From left to right: Toya Sakaguchi, Nagamine, presentations in Hubs

Next, William Chan (Engineer, Designium/XR) explained how Designium has been using Hubs for internal communication between its two offices, Tokyo and Aizuwakamatsu (located in two different prefectures). At Designium, they re-created Aizuwakamatsu Castle in Hubs, and the employees of both offices used the location for a meetup. After the employees got used to Hubs, the meetup felt like a school trip, and everyone took photos to commemorate the event. As a bonus, William Chan showed a reproduction of the Tokyo office in Hubs.

Yukihiko Aoyagi (Digital Standard Co.) gave a talk about localizing Hubs into Japanese and building EC (e-commerce) sites. He wants to make Hubs user-friendly for Japanese creators so they can meet up, create projects, and launch e-commerce businesses. To achieve this, Hubs had to be customized for Japanese users by translating its text data in code. Yukihiko Aoyagi then showed an EC shop in Hubs and discussed the pros and cons of the mobile and desktop versions of the app. He also gave tips on how to design a UI (user interface) and outlined the steps needed to create an EC shop in Hubs.
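The talk did not go into implementation details, but the general technique behind this kind of localization is a string table keyed by message ID, with the Japanese translation chosen at render time. The message IDs, strings, and helper names below are made up for illustration and are not taken from the actual Hubs locale files:

```typescript
// Illustrative sketch of string-table localization, the general technique for
// translating a web client such as Hubs into Japanese. The message IDs and
// strings are invented for this example.
type Locale = "en" | "ja";

const messages: Record<Locale, Record<string, string>> = {
  en: {
    "entry.enter-room": "Enter Room",
    "entry.mic-permission": "Allow microphone access to talk with others.",
  },
  ja: {
    "entry.enter-room": "ルームに入る",
    "entry.mic-permission": "他の参加者と話すにはマイクへのアクセスを許可してください。",
  },
};

function detectLocale(): Locale {
  // Pick Japanese when the browser reports a Japanese language preference.
  return navigator.language.startsWith("ja") ? "ja" : "en";
}

function t(id: string, locale: Locale = detectLocale()): string {
  // Fall back to English (and finally to the raw ID) when a translation is missing.
  return messages[locale][id] ?? messages.en[id] ?? id;
}

// Usage: render a button label in the visitor's language.
document.querySelector("#enter-button")!.textContent = t("entry.enter-room");
```

Falling back to English when a translation is missing keeps a partially localized build usable while the Japanese strings are still being filled in.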

From left to right: William Chan Long Yin, Yukihiko Aoyagi, presentations in Hubs

Nakamura (REALITY, Inc.; the company was renamed from Wright Flyer Live Entertainment, Inc. on October 1st) talked about how to run Hubs events with staff. For a Hubs event to go smoothly, just as with events in physical locations, room moderators and staff are necessary. Nakamura proposed having three different types of staff during an event. For example, staff A could be responsible for object creation and cleaning up the Hubs room after the event. Staff B could focus on muting people who did not turn off their microphones. Staff C could be responsible for fly mode and filming the event. His presentation slides are available on SlideShare.

https://www.slideshare.net/vrstudiolab/vrsionup7-hubs/1

Yusuke Yamazaki (Tokyo Institute of Technology / GREE VR Studio Laboratory) talked about research life at the VR Studio Lab and “VibeShare”, a technology for sharing enthusiasm. Yusuke Yamazaki has been developing VibeShare at GREE VR Studio Lab. Its predecessor, HapBeat, was used at SIGGRAPH 2019 to create interactions between physical and virtual venues that went beyond simple image transmission. HapBeat is a necklace-type wearable device that transmits sound vibrations onto the body, providing an immersive sound experience. VibeShare is a new technology that connects players with their audience by sending haptic feedback based on their comments. His presentation slides are available at this link.
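The talk did not describe VibeShare’s internals, so the sketch below only illustrates the general idea of comment-driven haptics: audience comments arrive over a network connection and are mapped to vibration pulses, with the browser’s Vibration API standing in for a dedicated wearable such as HapBeat. The endpoint, message format, and intensity mapping are assumptions for the example:

```typescript
// Illustrative sketch of comment-driven haptics in the spirit of VibeShare.
// The WebSocket endpoint, message format, and intensity mapping are assumptions;
// the Vibration API stands in for a wearable haptic device such as HapBeat.
interface CommentEvent {
  text: string;
  excitement: number; // 0..1, e.g. derived from emoji or comment rate
}

const socket = new WebSocket("wss://example.com/comments"); // placeholder URL

socket.onmessage = (event: MessageEvent<string>) => {
  const comment: CommentEvent = JSON.parse(event.data);

  // Longer pulse for more excited comments, clamped to 50..400 ms.
  const excitement = Math.min(Math.max(comment.excitement, 0), 1);
  const durationMs = Math.round(50 + 350 * excitement);

  // Vibrate the viewer's phone; a dedicated device would receive this instead.
  if ("vibrate" in navigator) {
    navigator.vibrate(durationMs);
  }
};
```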

From left to right: Nakamura, Yusuke Yamazaki, presentations in Hubs

Akihiko Shirai (GREE VR Studio Laboratory) gave two talks. He began by introducing the different research results of GREE VR Studio Lab, which are archived in the Hubs room shown in the image below.

VRStudio OpenLab 2020 in Hubs https://hubs.mozilla.com/DyECcpy/vrstudio-openlab2020

The Hubs space was organized into zones, and each zone had videos and slides about a project. Akihiko Shirai explained Koeuranai (a voice-changing app) and played a video demonstrating the technology. He then moved to the international zone on avatar society research, which focuses on current trends. In his second presentation, Akihiko Shirai talked about Hubs’ implementation of WebVR, its history, the technical aspects of Hubs, and GREE VR Studio Lab’s first experience with Hubs during IEEEVR2020 (you can read more about it at this link). At the end of the presentation, Akihiko Shirai announced VTech Challenge 2020 (you can read more about it at this link). This second edition focuses on various technologies and aspects of virtual live entertainment and virtual beings. The event will take place in Hubs. Akihiko Shirai’s presentation slides are available at this link.

Akihiko Shirai presentation in Hubs

Socializing in Hubs

About 10 people (who were invited before the event) watched the conference in Hubs, while approximately another 240 watched the live stream on YouTube. Only those who registered could enter the Hubs room via a URL. As mentioned in the beginning, the speakers were in another Hubs room. The organizers decided to separate the viewing venue from the research archive venue where the speakers were because they wanted to offer both an interactive experience (after the conference) and good viewing quality in Hubs. While in the viewing venue, viewers could focus on the presentations without being distracted by the surrounding research archives.

A photo from the networking party https://hubs.mozilla.com/DyECcpy/vrstudio-openlab2020

Once the presentations were over, the URL of the Hubs research archive room was announced in the Hubs viewing room and on YouTube. Not only registered viewers but also those watching the YouTube live stream could open the link and join the venue. Once viewers entered the room, they could browse the different research projects and interact with one another and with the speakers. After a couple of minutes, once everyone got used to Hubs, people began creating 3D objects, drawing, and taking photographs. While some continued discussing the presentations, others talked about future VR events such as Virtual Gakkai 2020 (link). A Japanese VTuber, Dokokano Usagi (link), drew her rabbit logo with the Hubs pen. Yusuke Yamazaki changed his avatar to the HapBeat necklace. Others created 3D beer and sushi.

A shot during the Networking Party in Hubs https://hubs.mozilla.com/DyECcpy/vrstudio-openlab2020

Main takeaways

In my blog posts about IEEEVR2020 and EMTECH, I discussed how both events lacked interaction between viewers and speakers. Compared to those two events, VRSionUp7 was better in terms of interaction and participation. Some people came with friends: while some went deep down the rabbit hole of tech talks, others played with objects and drew messages while waiting for them. At the same time, there were those who came alone and just wanted to look at the research results. Nobody seemed bored or disconnected.

Conclusion

In conclusion, Hubs can be used for a variety of purposes: meetups, EC, academic events, tech talks, internal communication, business, and more. Due to COVID-19 restrictions, Hubs became a space for R&D and online research. While the general outcomes of the event are positive, a few things need further thought. First of all, how can we make Hubs events more accessible to those who are unfamiliar with Hubs? While tutorials exist on the Internet and only 3–4 clicks are required to access Hubs, some users might still face difficulties. Second, how can we judge whether the event was popular? VRSionUp7 had approximately 240 live viewers on YouTube, and while that might seem like a rather big number, is it really?

We also need to think about the best methodology for archiving conference content. For example, slides can be made available for public viewing, the video archive can stay online, the original event description on Connpass can be updated, and a blog post can be published on Medium. However, those are all different platforms, and following them requires knowing when and where each piece of information is uploaded. Archiving is about LTV (lifetime value): once content is made available online for public consultation, it generates value over its whole lifespan. Thus, by archiving content, we produce LTV. Why is this important? Because those who could not participate in the event, did not know about it, or were not interested at the time can still learn by consulting the archive. The goal of research, development, and conferences is not only to produce value and meaning now, but also in the future. And the only way to reach tomorrow’s audience with today’s content is to keep that content available for consultation tomorrow.

Our next event will be Virtual Beings World at SIGGRAPH 2020! Virtual Beings World Showcase 2020 “New Play Together” is the 2nd Birds of a Feather (“BoF”) workshop showcasing the rapidly growing phenomenon of “Virtual Beings” (AI-driven storytelling, virtual humans, VTubers, virtual influencers, digital doubles, virtual idols, and virtual entertainers).

About Liudmila Bredikhina

Liudmila Bredikhina is an anthropologist working mainly on avatars and virtual characters that enable people to perform and express their virtual identity. Her research takes a gender approach, questioning how human interactions, self-expression, and kinship relations are re-negotiated in a more-than-human world.
https://twitter.com/BredikhinaL
https://unige.academia.edu/LiudmilaBredikhina


GREE VR Studio Laboratory

R&D division creating a new future through XR entertainment research and the VTuber industry, together with REALITY, Inc., in the GREE Family.