Projects Update 1 (Mar 2023)

Tracy Harwood Blog March 20, 2023

This month, we have two weeks of projects to share with you. This week, we focus on the Unreal film projects we found. The breadth of work folks are creating with this toolset is astounding – all these films highlight a range of talent, development of workflows and the accessibility of the tools being used. The films also demonstrate what great creative storytelling talent there is among the indie creator communities across the world. Exciting times!

NOPE by Red Render

Alessio Marciello (aka Red Render) used UE5, Blender and iClone 8 to create NOPE, a Jordan Peele-inspired film released on 11 December 2022. The pace and soundscape are impressive, the lucid dream of a bored schoolboy is an interesting creative choice, and we love the hint of Enterprise at the end! Check it out here –

The Perilous Wager by Ethan Nester

Ethan Nester’s The Perilous Wager, released 28 November 2022, is our next short project pick and uses UE’s Metahumans. This is reminiscent of film noir and crime drama, mixed with a twist of lime. It’s a well-managed story with some hidden depths, only really evidenced in the buzzing of flies. It ends a little abruptly but, as its creator says, it’s about ideas for larger projects. It demonstrates great voice acting, and we also love that Ethan voiced all the characters himself, using Altered.AI (he says) to create vocal deepfakes. He highlights how going through the voice acting process helped him improve his acting skills too – impressive work! We look forward to seeing how these ideas develop in due course. Here’s the link –

Gloom by Bloom

Another dark and moody project (it’s also our feature image for this post), Gloom was created for the Australia and New Zealand short film challenge 2022, supported by Screen NSW and Epic. The film is by Bloom, released 17 December 2022, and was created in eight weeks. The sci-fi concept is great, the voice acting impressive, and the story is well told with some fab jumpscares too. The sound design is worth taking note of, but we recommend you wear a headset to get the full sense of the expansive soundscape the team have devised. Overall, a great project and we look forward to seeing more work from Bloom too –

Adarnia by Adarnia Studio

Our next project is one that turns UE characters into ancient ones – a slightly longer format project, this has elements of Star Wars, Blade Runner and just a touch of Jason and the Argonauts mixed together, with an expansive cityscape to boot. Adarnia is a sci-fi fantasy created by Clemhyn Escosora and released 19 March 2021. There’s an impressive vehicle chase which perhaps goes on just a little too long, but there’s also an interesting use of assets, replicated in different ways across the various scenes, that is brought together nicely towards the end of the film. The birdsong is a little distracting in places – one of those ‘nuisance scores’ we highlighted in last week’s blog post (Tech Update 2). There’s clearly a lot of work that’s gone into this, and perhaps there’s scope for a game to be made with the expansiveness demonstrated in the project, but the film’s story needs to be just a little tighter. We guess the creators agree, because their YouTube channel is full of excerpts focussing on key components of this work. Check out the film here –

Superman Awakens by Antonis Fylladitis

Our final project for this week is a Superman tale, created by a VFX company called Floating House. The film, released on 13 February 2023, is inspired by Kingdom Come Superman and the work of Alex Ross, and is a very interesting mix of comic styling and narrative film, with great voice acting from Daniel Zbel. It’s another great illustration of the quality of UE assets in the hands of talented storytellers –

Next week, we take a look at films made with other engines.

Tech Update 2 (Mar 2023)

Tracy Harwood Blog March 13, 2023

We’ve seen a number of tech developments in recent weeks that we’ll share in this post. Everything from free tools, great content packs, wrinkles for those of a certain age of course, mocap for newbies, nuisance scores, a heads-up on a lightweight headset, and more!

Lights, Camera, Action

A member of Chantal Harvey’s popular Machinima Mondays Facebook Group posted a video recommendation by Kevin Stratvert covering five free screen recording tools that all machinima and virtual production folks should have in their applications folder. He usefully goes through the process of using each of them in his tutorial here –

We highlight just a few of the exciting things we’ve seen in the last few weeks for Unreal Engine. A show-and-tell tutorial on making ragdoll puppets, reported in 80.lv and featuring 3D artist and animator Peter Javidpour, gives a great breakdown of the process, including how to rig the virtual camera. The Blueprints-based process was used in his recent short release, My Breakfast with Barf, link here –

Also using Blueprints, Machina-Infinitum.com released a content pack for making procedural fractals. They look really beautiful – and perfect for that next cyberpunk-cum-inceptionist film. The pack isn’t free at $99, but it looks like a good investment, available on the Unreal store here. Here’s a link to their YouTube channel and tutorials for using the assets –

Also not free (£170.70) is another excellent content pack, this one containing realistic building assets from what looks like the Whitechapel area of London: the British City Pack by Polyspherestudio.com. Here’s an overview on their YouTube channel –

Reallusion released a much-awaited update to its Character Creator, introducing a dynamic wrinkle system. The plasticity of facial animations using CC4 is something we’ve often found ourselves commenting on in our film reviews, and this is a very interesting development. Check out the overview here –

Plask’s mocap app has been upgraded. This is an app we’ve mentioned before, which allows you to record, edit and animate projects in your browser. For pros, there’s a monthly fee, but for newbies, its freemium model looks like a great way to get started in mocap. Here’s an overview from their YouTube channel, which also contains tutorials on how to integrate the content with platforms like Blender, Unreal Engine and others –

With interoperability at its heart, ReadyPlayerMe is going from strength to strength. Its recent blog post sets out its ambition, and this highlights what great potential its avatars have to be cross-platform virtual storytellers, although as yet we’ve not seen much of that emerging.

For sound design tips, you can do no better than take a look at REAPER. Anne-Sophie Mongeau has written a great two-part article on Asoundeffect.com, which is definitely worth checking out, and whilst you’re there, you can check out the massive curated collection of sound effects on the website too.

For those exploring immersive experiences, we found another great article on Asoundeffect.com, this one discussing the impact of ‘nuisance scores‘ on the listener – we certainly have some experience of that in the films we see too.

And for those seeking an alternative to wearying virtual reality headsets, Bigscreenvr.com‘s new system looks very impressive. It’s just 127 grams and has great resolution – most headsets weigh in at around 450-650 grams, roughly a bag of sugar for those home chefs in the know – so it will surely be much more usable than current tech. Bigscreen has just released an overview of the new headset, shipping begins in Q3 2023, and I’m more than tempted to get my order in early on this one…

You’re Welcome!

Finally this week, the Second Life endowment for the arts process is changing. For years, Second Life has been a massive advocate for its community of content creators, and the changes, which give creators more time to develop their builds, are another example of its fantastic support (notwithstanding the truly, err, colourful gif on its announcement page, our feature image this week). Here’s a link to its grant page.

S3 E67 Film Review: The Gateway (March 2023)

Tracy Harwood Podcast Episodes March 2, 2023

This week’s film review is The Gateway by Ritualin Films, made with Unreal Engine and released in May 2020. It is a mix of some well-known alien-life sci-fis we all love; the pacing is nicely done and there are some interesting effects that we comment on, although we’re not quite sure what the ending is all about.



YouTube Version of this Episode

Show Notes & Links

Film –

VFX breakdown of the film –

Ricky’s sci-fi film recommendation, JUNG_E directed by Yeon Sang-ho. Here’s the trailer –

Projects Update: Feb 2023

Tracy Harwood Blog February 20, 2023

This week’s #MondayMotivation post has some more great examples of machinima and virtual production projects. We have a selection of shorts made using Unreal Engine and another entirely made in Blender, plus a couple of ‘making ofs…’ and a ‘role of…’ also worth checking out.

Projects

Bryn Oh is an artist we’ve talked about before, who has created extensive work in Second Life over many years; now she has created a nostalgic experience called Lobby Cam, made using UE5 and available on Steam. The experience is a walking tour of an extensive environment and a story told through the pages of a ripped-up diary. The project has been reviewed by James Wagner Au on his blog, New World Notes, here. It is described as part of a larger narrative, and here’s a video sampler of the tour produced as part of that too… an interesting approach to virtual storytelling –

Off planet, another project which contains amazing detail of other worlds is by Melody Sheep, called The Sights of Space: A Voyage to Spectacular Alien Worlds (released 29 Nov 2022). This is a 30-minute-long film of speculative depictions of space scenes based on ‘current scientific understanding’ of the Milky Way, albeit with extensive creative license. If you ever wanted to get into a new type of documentary, this is probably the one to have on your watchlist –

We were also thrilled to see what promises to be a very interesting new series by Melody Sheep launching later this year, called The Human Future – check out the trailer on the channel here.

In our next project pick, JOYCE by GTshortStories (released 14 Dec 2022), UE5 and every available tool with it have been used to create an interesting space story. This mixes live action with some well-done animation, and the integration is done really well, so it’s a great example to check out. Joyce is a backchatting robot exploring a facility along with Sergeant Terry Brown – there are many references to popular sci-fi tropes, so do check this one out! GTshortStories is also putting out other creative content, so check out the channel too.

Our final space project for this week is Countdown, by Andrew Klimov, featured on CGChannel. This is a fast-paced story of a crash landing on an alien planet, all about the crash itself, and it certainly makes you feel it. The crash is the beginning of a new series and you can find out more about that on his website here. There’s also an interesting breakdown of the filmmaking process on his Vimeo channel here.

Our next project pick goes back to the 11th century, inspired by an Umbrian folk tale in the novel ‘E poi si fece buio’ by Matteo Bebi. It is about a dream by Imiltrude, who lived in a hidden village and was sentenced to death for having caused a fire that destroyed a city. The film, HIMIL, is by Tiziano Fioriti and Andrea Brunetti, made using UE5, and is a fascinating first-person-perspective piece with a very well done soundscape –

Our next project pick is a Blender-made movie and another example of great storytelling, this time in a cyberpunk environment with a really nice twist in the tale. Not sure it would be Ricky’s cup of tea, to his point about emotional representation, but I certainly loved it! The story has been created by the Blender HQ team, so it’s by no means an indie endeavour – there’s a whole team of folks behind the processes employed – but it’s definitely worth watching; check out the pace of the action and the sound design in particular. The film is called Charge – Blender Open Movie (released 15 Dec 2022) and you can access the production files and making-of videos for the film here.

Making of…

We always love a good homage to Star Wars, and this week we have a feature from Reallusion Magazine which describes how iClone and the Vicon mocap system have been used to recreate that iconic ‘I am your father’ scene from The Empire Strikes Back. The short has been made by Luis Cepeda from Quitasueño Studios, based in the Dominican Republic, and he provides a great step-by-step guide to how it was made, with a video overview here –

Ever wondered how to use a MIDI controller with UE5, letting you drive all sorts of effects in real time straight from the keyboard? Well, here’s a fantastic video tutorial for you by Taiyaki Studios featuring Cory Williams –

Role of…

And finally this week, we share Loralee Sundra’s video on the Internet Archive about the value of public domain films, from her perspective as a Frontline Fellow at the Documentary Film Legal Clinic at UCLA School of Law. Her talk was part of the Internet Archive’s Public Domain Day 2023 celebration, held on 25 Jan 2023.



Tech Update 2 (Feb 2023)

Tracy Harwood Blog February 13, 2023

This week, we highlight some time-saving examples for generating 3D models using – you guessed it – AIs, and we also take a look at some recent developments in motion tracking for creators.

3D Modelling

All these examples highlight that generating a 3D model isn’t the end of the process, and that once it’s in Blender, or another animation toolset, there’s definitely more work to do. These add-ons are intended to help you reach your end result more quickly, cutting out some of the more tedious aspects of the creative process using AIs.

Blender is one of those amazing animation tools that has a very active community of users and, of course, a whole heap of folks looking for quick ways to solve challenges in their creative pipeline. We found folks who have integrated OpenAI’s ChatGPT into the toolset by developing add-ons. Check out this illustration by Olav3D, whose comment about using ChatGPT for attempting to write Python scripts sums it up nicely, “better than search alone” –
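
For a flavour of what this looks like in practice, below is a minimal sketch of the kind of bpy script ChatGPT might hand back for a simple request such as “scatter a grid of cubes” – the grid size, spacing and object names are purely illustrative and not taken from Olav3D’s video.

```python
# Minimal illustrative sketch of a ChatGPT-style Blender script:
# scatter a 5x5 grid of cubes and give each one a descriptive name.
# (Grid size, spacing and names are hypothetical examples.)
import bpy

GRID_SIZE = 5
SPACING = 3.0
CUBE_SIZE = 1.0

for x in range(GRID_SIZE):
    for y in range(GRID_SIZE):
        # Add a cube primitive at this grid position.
        bpy.ops.mesh.primitive_cube_add(
            size=CUBE_SIZE,
            location=(x * SPACING, y * SPACING, 0.0),
        )
        # The newly added cube becomes the active object; rename it.
        cube = bpy.context.active_object
        cube.name = f"GridCube_{x}_{y}"
```

A script like this runs from Blender’s Scripting workspace; the point of the ChatGPT workflow is simply that you describe the result you want and iterate on the generated script rather than writing it from scratch.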

Dreamtextures by Carson Katri is a Blender add-on using Stable Diffusion which is so clever that it even projects textures onto 3D models (with our thanks to Krad Productions for sharing this one). In this video, Default Cube talks about how to get results with as few glitches as possible –

and this short by Vertex Rage tells you how to integrate Dreamtextures into Blender –

To check out Dreamtextures for yourself, you can find Katri’s application on GitHub here, and should you wish to support his work, subscribe to his Patreon channel here too.

OpenAI also launched its Point-E 3D model generator this month, the output of which can then be imported into Blender. But, as CGMatter has highlighted, using the published APIs means a very long time sitting in queues to access the downloads, whilst downloading the code to your own machine to run it locally is easy – and once you have it, you can create point-cloud models in seconds. In his case, he’s running the code from Google’s Colab, which means you can run it in the cloud. Here’s his tutorial on how to use Point-E without the wait, giving you access to your own version of the code (on GitHub) in Colab –
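
If you want to pull one of those point clouds into Blender, one simple route is to export the sampled XYZ coordinates and rebuild them as a vertex-only mesh. Here’s a minimal sketch under that assumption – the file name “pointcloud.npz” and the array key “coords” are hypothetical stand-ins for however you choose to save the Point-E output, and this isn’t taken from CGMatter’s tutorial.

```python
# Minimal sketch: load saved Point-E coordinates and create a
# vertex-only mesh in Blender. The file name and array key are
# hypothetical - adjust them to however you exported the point cloud.
import bpy
import numpy as np

data = np.load("pointcloud.npz")
coords = data["coords"]  # expected shape (N, 3): x, y, z per point

# Build a mesh with vertices only (no edges, no faces).
mesh = bpy.data.meshes.new("PointE_Cloud")
mesh.from_pydata([tuple(p) for p in coords], [], [])
mesh.update()

# Wrap the mesh in an object and link it into the current scene.
obj = bpy.data.objects.new("PointE_Cloud", mesh)
bpy.context.scene.collection.objects.link(obj)
```

From there you can instance small pieces of geometry onto the points (with Geometry Nodes, for example) to make the cloud renderable.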

We also found another very interesting Blender add-on, this one letting you import models from Google Maps into the toolset. The video is a little old, but the latest update of the mod on GitHub, version 0.6.0 (for RenderDoc 1.25 and Blender 3.4), has just been released, created by Elie Michel –

We were also interested to see NVIDIA’s update at CES (in January). It announced a release of the Omniverse Launcher that supports 3D animation in Blender, with generative AIs that enhance characters’ movement and gestures; a future update to Canvas that includes 360-degree surround images for panoramic environments; and an AI ToyBox that enables you to create 3D meshes from 2D inputs. Ostensibly, these tools are for creators developing work for the metaverse and web3 applications, but we already know NVIDIA’s USD-based tools are incredibly powerful for supporting collaborative workflows, including machinima and virtual production. Check out the update here, and this is a nice little promo video that sums up the integrated collaborative capabilities –

Tracking

As fast as the 3D modelling scene is developing, so is motion tracking. Move.ai, which launched late last year, announced its pricing this month: $365 for 12 months of unlimited processing of recordings. This is markerless mocap at its very best, although not so much if you want to do live mocap (no pricing announced for that yet). Move.ai (our feature image for this article) lets you record content using mobile phones (even a couple of old iPhones). You can find out more on its new website here, and here’s a fun taster called Gorillas in the Mist, with ballet and 4 iPhones, released in December by the Move.ai team –

And another app, although not 3D, is Face 2D Live, released by Dayream Studios – Blueprints in January. This tool allows you to live-link a Face app on your iPhone or iPad to make cartoons out of just about anything, including with friends who are also using the iPhone app. It costs just $14.99 and is available on the Unreal Marketplace here. Here’s a short video example to whet your appetite – we can see a lot of silliness ensuing with this for sure!

Not necessarily machinima but for those interested in more serious facial mocap, Weta has been talking about how it developed its facial mocap processes for Avatar, using something called an ‘anatomical plausible facial system’. This is an animator centric system that captures muscle movement rather than ‘facial action coding’ which focusses on identifying emotions. Weta stated its approach leads to a wider set of facial movements being integrated into the mocapped output – we’ll no doubt see more in due course. Here’s an article on the FX Guide website which discusses the approach being taken and for a wider ranging discussion on the types of performance tracking used by the Weta team, Corridor Crew have bagged a great interview with the Avatar VFX supervisor, Eric Saindon here –