the victoria and albert museum in london is a treasure trove of beautiful artwork, some three-dimensional wonders and some masterpieces are so realistic you could walk into them. which is precisely what is about to happen in the painting gallery where, in amongst the paintings, we are about to get stuck into some very modern art. there is a dancer and she is wearing pyjamas. or is that fashion these days? that's the fashion. this is an augmented reality performance you can see using a headset and specifically designed to take place in this space. it's the brainchild of roland who had a dancer's performance
captured in a volumetric space and a cgi landscape built around her. it is interactive. touch the globe or the flowers and they will react. the lens is looking at my hands as well as the space. some of the objects looked really gooey and i didn't feel like touching them. i don't know if the painters would have used this technology if it was available at the time. i think it's interesting to position the stream surrounded by turner and constable
work. they might have loved or hated it. this experiment looks at how a virtual reality art experience might work in the future and once the dancer disappears the whole thing simmers down to work more in harmony with its surroundings. when i say simmers down... i'm loving that! that's every doctor who and star wars experience i've ever wanted right there. it feels weird to be back in the room. i got lost in that. the normal world now feels strange. yeah, we work on click, we don't work on the normal world. someone taking that concept to the next level is mark who has been asking where all of this could go next. the answer is the metaverse. the
metaverse, if we think of the internet as something we look at, the metaverse is a version of the internet that we are inside. the idea is we will experience the metaverse as an avatar, a virtual version of ourselves that we control as we explore this new online frontier. i'd say there are examples of the metaverse already. if you look at some video games, for instance, they are, you know, digital worlds that you can interact with as an avatar. i'd see the metaverse as an extension of technologies that we currently have. i think a lot of people see the future metaverse as expanding on that experience to include not just gaming, but maybe things like a digital workspace or digital events, digital socialising. andrew bosworth is from meta — the company formerly known as facebook. the tech giant says
it's transforming itself from a social media company into a metaverse company. we spoke using oculus virtual reality headsets, appearing as avatars inside software designed for virtual meetings, called workrooms. ok, so, boz, here we are in what might be considered a representation of what the metaverse could be. yeah, for us, the metaverse is a spatial construct, as opposed to the previous web, which was really a very linear kind of 2d, flat thing. we want this one to be immersive — something that you could, were you so inclined, really experience in an embodied way. now, of course, it doesn't mean it has to be virtual reality — it could also just be on a phone or on a desktop computer. you might have noticed that we're using the tools of the metaverse to create a good portion of this item. my avatar has been created by a couple of companies — ready player me and oz. they already create tools for people to make avatars from a photo. it's this virtual version of us which will travel
between online experiences in any metaverse. bosworth believes new online economies will spring up around these pixelated people. and then, over time, what i'm most excited about is an economy there. and i mean, you know, economy not just of digital goods, sure, and entertainment, that's great, but also services. in an immersive environment, i'm gonna have an avatar. i'm going to need a stylist, i'm going to have a home space, i'm gonna need a decorator, you know, and these are — i'm gonna invite my friends over to my home space. when we consider that video games already sell virtual goods like clothes or vehicles, we can see where this idea draws its inspiration. but there are other parts of online culture which these companies might be less enthusiastic about. now, in the contemporary online experience, there's a lot of online hatred out there, a lot of online abuse and misinformation and things of that nature. how are you going to avoid those kinds of experiences seeping into a metaverse?
yeah, so one thing that's interesting about metaverse experiences is that we imagine them being real-time. do you really want the system or a person standing by, listening in? probably not. i don't. that feels like a real violation of privacy. but if the conversation is happening in real time, then how can we moderate content without listening in? and so, i think we have a privacy trade-off against, you know, if you want to have a high degree of content safety — or what we would call integrity — well, that trades off against privacy. but i do think that we're gonna have, as a society, a lot of hard conversations ahead of us around the trade-offs between privacy, content and interoperability. yeah, the more time we spend in these digital worlds, the more data about ourselves we may be giving up and, obviously, that is a privacy concern — especially if you are going from one domain to another. maybe you're going from a digital work zone to a digital gaming zone.
do you want the same identity to be associated with both? there are calls that maybe you would have to verify your identity and match that to your avatar, so that people know who you are. but that in itself raises a whole bunch of privacy concerns. maybe not everyone would be comfortable doing that. microsoft has adapted its workplace meeting software teams for the metaverse by creating a system called mesh. it's designed to work with a variety of different devices, including virtual and augmented reality. ar, as it's known, projects graphics on top of the real world using headsets like microsoft's hololens or mobile phones. but after nearly two years of lockdowns and meetings with friends, family and colleagues via video call, is now the right time for an idea like this one? there's quite a few people that have got fatigued by having to have video chat meetings and things of that nature, and they realise they now crave human contact. human communication is about 5% speech, it's 95% everything else.
i've been in my — in my, you know, living room with the entire team around the table, right? making eye contact, where all the gestures are coming into the right place. and i can touch that digital object and instead of having a person next to me, having a walking one-on-one, i can have the avatar version of that person one-on-one next to me. so it changes completely the — the, you know, call it the "screen fatigue" we're feeling today. the next piece of the metaverse puzzle isn't just about seeing these virtual worlds, but feeling them as well. work on haptics or force feedback — the ability to touch and feel while inside a virtual space — has been going on for years now. the artist formerly known as facebook, meta, has revealed that it's been working on a glove that will let the user feel sensations, like holding an object. the glove has a number of sensors that measure the wearer's movements and air pockets across the glove's surface inflate to create sensation. these gloves aren't quite ready for prime time yet,
but they're an indicator of the kind of research that's going on behind the scenes. the big question, though, is will people embrace this new vision for our online lives? i think it depends on the specific application. we've seen in gaming lots of people really do enjoy those experiences. they use avatars, they interact with the world in that way. would you want to sit in a virtual office as an avatar? i'm less sold on that idea. it seems the metaverse is coming. but its success rests solely in our hands. hello and welcome to the week in tech. it was the week instagram said sorry for wrongly disabling the account of artist @metaverse for a month. it's since been restored. ibm and samsung unveiled vertical transistors on a semiconductor. why is this interesting? stacking more transistors on a chip could one day extend phone battery life!
and meta has opened its vr space horizon worlds to quest app users over 18 in the us and canada. the metaverse has gained notable real-world interest as clothing giant nike has bought a company that makes virtual sneakers. rtfkt, pronounced artefact, is an nft studio that, alongside footwear, builds other digital collectible items. a research lab is making polymer lenses in parabolic free-fall flights which simulate weightlessness. it looks fun, doesn't it? omer luria led the research on this method, which could help manufacture cheap glasses for the billions of us who need to correct our vision. the mixture is usually dropped into a mould, suspended in fluid, so the lens can take shape before being cured by uv light. and finally, even though it's around 93 million miles away, the sun is no longer outside of humanity's reach as nasa's spacecraft parker solar probe has been confirmed to have touched the sun. the craft, launched in 2018, first flew through the corona — the sun's atmosphere —
in april, but the data has only just been confirmed. been there, sun that! now, we had a jolly good time earlier at the v&a museum, didn't we? oh, we did. i felt like i saw art, and i felt like i was in art. mm, indeed. uh, well, if we are all going to be sucked into the metaverse, then we're going to need to populate it with realistic-looking people. at the moment, i'm not sure whether i want my avatar to be of me, a completely fake version of me, or a photorealistic me. neither are mark zuckerberg and his mates. but at least you and i already have a few versions of ourselves on file. we certainly have. well, as we mentioned earlier, the dancer inside the sonzai experience was very real — and she was recorded not using motion capture but using volumetric capture, something that we've seen before on click, most notably
when mark visited the intel studios and dunked that rather improbable hoop. yes, he was very pleased with himself. i'm not sure it really happened, to be honest. but anyway, this technology is different from motion capture, because it doesn't turn you into an avatar or an alien — it captures what you actually look like and what you're wearing on the day. so i popped along to dimension studios in wimbledon, whose little green space has been graced by some fairly big faces. who's the most famous person you've had in here? most recently, madonna. really? we also filmed with coldplay and bts. oh, god. drop — drop the names. drop the names. and simon thinks volumetric capture is ready to take on more of the load from the growing trend to hold events in the world of mixed reality. so we're seeing music performances delivered in the metaverse, we're seeing fashion shows where holograms have been captured. and they're being brought
to life on stage as part of the catwalk experience. we're seeing the adoption in film and tv, where movies are using it for crowd scene generation, which has advantages over traditional cg routes, or indeed for the creation of holograms who might represent, you know, a superhero, as part of the next marvel film, for example. i'm surrounded by 53 normal cameras and 53 infrared cameras. now, the infrared cameras capture my geometry, and the normal cameras capture what i look like. and once you've got that, the computer can work out what i look like from every conceivable angle, and that means you can decide on your shot and your camera move after the take. and, of course, you can put your performer, well, anywhere you want. and then you can go to bullet time! nice moves, spen, nice moves. now, currently this kind of stuff still takes a lot of processing to render, a lot of time to manually tidy
up, and a lot of space to store the data — it's ten gigabytes of footage per second. but, as always, this technology is only heading in one direction. so, volumetric video is evolving fast, and beyond the stage you're in today, we have two other stages — we have a mobile truck, which is one of the first in the world of its kind, which is actually an upgrade from the system you see, so we're capturing in 4k. and it's on a truck. it's on a truck. it can go to the film studios. mobile, you see? or to a sports stadium or music venue, and it can capture the talent on location. but ultimately we're heading towards a future where this — this technology also goes live and becomes streamable. so, we're working on systems now that will become available in 2022, where you'll be able to capture a holographic or volumetric performance in one location and stream it in real time to another. now, while these 3d creations can be experienced in augmented or virtual reality, spare
a thought for the poor designers, like andy mcnamara, who created sonzai, and who, for the most part, are trying to design immersive environments on flat screens. i mean, obviously we work on a 2d canvas, and i think this has always been the problem of working as a cgi artist or designer. there's been some stuff with vr done in the past, where you can actually view things and look around, but it's still clunky. augmented reality holography is the ultimate way. especially if you can interact with what you're seeing on screen. in the meantime, i asked andy to try out this little holographic display — it's called the looking glass portrait. the device converts a 3d image into dozens of different perspectives and then shows them all at once. but, depending on the angle from which you're looking at it, each of your eyes sees only one of these perspectives at a time, which means, as you move from side to side, the background and foreground move relative to each other, giving a real illusion of depth. this is something called a lenticular screen, which is not, in itself, a new idea, but this device
is the best example of it that i've seen. it can even show a photo taken from your phone and use the extra depth information from your second lens. you could actually see this as being the future of what a lot of people have in their homes in order to show, you know, their relatives from standard photos. because this will take standard photos in a certain format, and it will turn them into a 3d representation of people, so it's an immediate application. i think even for works of art, it's really interesting to see how you could add depth to paintings or give it to a proper fine artist and see what they could do in order to change it into something. yeah. and see what they could do with it and have it large on a wall as a display. i think that is a really good application of it. at the moment, the resolution is not high enough for andy to find useful, and, at $319, it is a pretty expensive bit of decor. but, as i say, it does a really good job of creating the illusion of 3d without
the need for a headset. it would certainly be nice not to have to wear the glasses for this kind of experience. yeah, they are chunky, aren't they? so at the v&a, we each wore a hololens, but you've tried a different type of ar glasses, haven't you, from nreal? yeah, that was at ces in 2020. i was impressed with them, i have to say. well, they were quick to sell out when they were released in the us, and chris fox has been testing them for us. but do they deliver the augmented reality future we've all been promised? i've tried a few augmented reality headsets in this job, and these are probably the closest to a practical consumer product i've seen. the glasses have to be tethered to your android smartphone to provide the visuals and also the battery power. and that means you don't have to charge your glasses, which is great, but it will drain your phone's battery a bit quicker. it has built-in speakers and microphones and cameras on the front to track the room so that the visuals all stay in the right place where
they're supposed to. and you have full freedom to get up and walk around and look at things, and the perspective changes as you move. it's really quite cool. the visuals come from two high-definition oled panels in the glasses that are beamed into your eyes via the lenses. and the visuals really do look bright and sharp. it's like a projector is shining images on the walls around you. now, the visuals don't fill your entire field of vision — turn your head, and things start to disappear. nreal says the glasses have a 52-degree field of view, and, in practice, it means whatever you look directly towards is visible, and things start to tail off in your peripheral vision. they've managed to squeeze all of this into a fairly ordinary-looking pair of sunglasses, although whether you think these look fashionable is probably subjective. i've had a few people say these look like dad sunglasses, and also the meme sunglasses. nreal's developed an app called nebula, which acts as your home screen, and you use your tethered phone as a remote to point at what you want and tap the screen to click.
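that 52-degree field of view is easier to picture with a little trigonometry. here's a rough, illustrative sketch (the function name and the one-metre viewing distance are my own, not from the programme) of how much of a wall that viewing angle actually covers:

```python
import math

def visible_width(fov_degrees: float, distance_m: float) -> float:
    """Width (in metres) spanned by a horizontal field of view
    at a given viewing distance: width = 2 * d * tan(fov / 2)."""
    half_angle = math.radians(fov_degrees / 2)
    return 2 * distance_m * math.tan(half_angle)

# a 52-degree field of view covers just under a metre of wall
# at one metre away
print(round(visible_width(52, 1.0), 2))  # prints 0.98
```

which is why anything outside the patch you're looking directly at falls off the edge of the display as you turn your head.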
and you can launch mixed reality apps or pin web browser windows around your room. i tried this around the kitchen by pinning a virtual recipe to the wall, and then i could look at my cooking and then turn my head and look at the wall and see my recipe. i suppose it was a little bit more convenient than having to check my phone and unlock my phone when i've got raw ingredients on my hands. you can also pin a huge virtual display on the wall so it looks like you're watching tv or a film on a huge cinema screen. although i already have quite a big tv in my living room, and if i'm somewhere like a hotel or train, i can usually get by just fine watching tv on my phone. there are obviously some games to play. one is this ghost-shooting game where the ghosts are overlaid on your real living room. and another app turns your desk into a tower defence game, so you can walk around and see it in 3d. these were definitely very cool to try, and i think anyone
who has a go at this would say it was very impressive as a tech demo. i don't know how often you would go back to these games. for me, the real potential in this lies in productivity. imagine being able to go to a coffee shop, putting these on, and seeing three huge virtual monitors so you could do some video editing without needing to bring a display with you. the closest i got was plugging this into my laptop and the glasses then act as a second monitor, so i can look down at my laptop to see my timeline and then look up into the display and see the footage i was editing on a huge screen. even though you can move around the room wearing these, they still feel like quite a stationary experience to me. i don't think you would want to wear these on your commute or while you're out shopping. they're not the kind of glasses where you are going to see notifications flying in, or see real-world maps and directions superimposed on the street in front of you — it's just not that kind of augmented reality experience. although it's an idea of what we might get in the future.
for that to happen, i'd expect these glasses to get lighter and obscure less of your field of view. you'd need to get rid of this tether, and you don't want to use the phone as a controller, you want some hand recognition, which nreal is working on. once some big companies like apple and facebook enter the space and throw their big apps onto the platform, you can really see the potential. but as an early entry into the space, it gives you a flavour of what the technology can do. i don't know about you, but i do think augmented reality is pretty much there now, which means in the future the 3d sculptures that you see around you might not actually be in the room. true. it's not quite the same, though, is it? you can't touch it. not that you could touch that, obviously! no, for goodness' sake. don't touch it, we'll all be in trouble. listen, there's one thing about ar they still have to sort, though. what's that? it's the ar groove and ar hair. oh no, i haven't still got that mark on my forehead, have i? yeah, and my hair's all over the place. once you've sorted that, we're in. anyway, that is it from us from the v&a in london.
as ever, you can keep up with the team throughout the week on social media. find us on youtube, instagram, facebook, and twitter, @bbcclick. thanks for watching. we'll see you soon. bye-bye. good afternoon. a north-south divide continues for the rest of the day. it's likely we will continue to see rain through northern ireland, northern england and into much of scotland. it stays cold in the far north. further south it's a damp afternoon but noticeably milder, temperatures peaking into double figures. through the night, we could see the rain turning to snow above 200 metres in scotland. behind it, it's a drab, dreary night with
patchy mist and fog forming, still relatively mild here but fog could be an issue first thing on christmas morning. if you are off early to visit friends and family, it's worth bearing that in mind. it will lift into low cloud, making for a rather dreary christmas eve for many. the rain becomes light and patchy in scotland but through southwest england, wales and eventually into northern ireland, we will see some wet and windy weather arriving. milder in the south, staying colder in the north.
this is bbc news, broadcasting to viewers in the uk and around the world. i'm martine. our top stories... scientists cautiously welcome uk studies suggesting that the highly contagious omicron covid variant is milder than previous versions. britain's health secretary warns that the sheer number of infections could still lead to hospitals being overwhelmed. we do know with omicron that it does spread a lot more quickly, it's more infectious than delta, so any advantage gained from reduced risk of hospitalisation needs to be set against that. new south wales proposes to charge unvaccinated people for covid medical costs — the doctors' union says it's unethical.