
Tech Update 1 (Nov 2022)

Tracy Harwood Blog October 30, 2022

Hot on the heels of our discussion of AI generators last week, we are interested to see tools already emerging that turn text prompts into 3D objects and film content, along with a tool for making music. We have no fewer than five interesting updates to share here – plus a potentially very useful tool for rigging the character assets you create!

Another area of rapid technological advancement is mo-cap, especially markerless mo-cap, which, let's face it, is really the only way to think about creating naturalistic movement-based content. We share two interesting updates this week.

AI Generators

Nvidia has launched an AI tool that generates 3D objects (see video). Called GET3D (derived from 'Generate Explicit Textured 3D meshes'), the tool can generate characters and other 3D objects, as explained by Isha Salian on the Nvidia blog (23 Sept). The code for the tool is currently available on GitHub, with instructions on how to use it here.
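For a flavour of what working with it looks like, here's a minimal sketch of sampling a textured mesh from a GET3D-style generator. Note that `generator` and its call signature are hypothetical stand-ins rather than the repo's actual API – follow the GitHub instructions for the real entry points.

```python
# Minimal sketch: sampling a textured mesh from a GET3D-style generator.
# `generator` and its signature are hypothetical stand-ins, not GET3D's API.
import torch

def sample_textured_mesh(generator, device="cuda"):
    # GET3D conditions generation on two latent codes:
    # one for geometry (shape) and one for texture (appearance).
    z_geometry = torch.randn(1, 512, device=device)
    z_texture = torch.randn(1, 512, device=device)
    with torch.no_grad():
        # Unlike NeRF-style methods, the output is an explicit textured
        # mesh (vertices, faces, texture map), ready for export.
        mesh = generator(z_geometry, z_texture)
    return mesh  # e.g. save as OBJ/FBX for use in a game engine or DCC tool
```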

Google Research, together with researchers at the University of California, Berkeley, is also working on similar tools (reported in Gigazine on 30 Sept). DreamFusion uses NeRF tech to create 3D models that can be exported into 3D renderers and modeling software. You can find the tool on GitHub here.

DreamFusion
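The core idea in the DreamFusion paper is 'score distillation sampling': render the NeRF from a random camera, add noise to the render, then use a frozen text-to-image diffusion model's noise prediction as a gradient that nudges the NeRF toward images the model finds likely for the prompt. A rough sketch of one optimisation step follows; `nerf`, `render`, `diffusion` and `sample_random_camera` are hypothetical stand-ins, not code from the repo.

```python
# One Score Distillation Sampling step, per the DreamFusion paper.
# All objects here (nerf, render, diffusion, sample_random_camera) are
# hypothetical stand-ins, not an actual repo API.
import torch

def sds_step(nerf, render, diffusion, sample_random_camera,
             text_embedding, optimizer):
    camera = sample_random_camera()          # new random viewpoint each step
    image = render(nerf, camera)             # differentiable NeRF render
    t = torch.randint(20, 980, (1,))         # random diffusion timestep
    noise = torch.randn_like(image)
    noisy_image = diffusion.add_noise(image, noise, t)
    with torch.no_grad():                    # the 2D diffusion model stays frozen
        noise_pred = diffusion.predict_noise(noisy_image, t, text_embedding)
    # SDS gradient: difference between predicted and injected noise
    # (the paper also applies a timestep-dependent weight, omitted here).
    image.backward(gradient=noise_pred - noise)
    optimizer.step()
    optimizer.zero_grad()
```

Repeat this for thousands of steps and the NeRF converges on a 3D scene whose renders match the prompt from every angle.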

Meta has developed a text-to-video generator called Make-A-Video. It can animate a single image, or fill in the motion between two images, and currently generates five-second videos – perfect for background shots in your film. Check out the details on their website here (and sign up for their updates too). Let us know how you get on with this one!

Make-A-Video

Runway has released a Stable Diffusion-based tool, called Erase and Replace, that lets creators switch out the bits of an image they don't like and replace them with things they do (reported in 80.lv on 19 Oct). There are some introductory videos available on Runway's YouTube channel (see below for the introduction to the tool).

And finally, also available on GitHub, is Mubert, a text-to-music generator. This tool uses a Deforum Stable Diffusion colab. Described as proprietary tech, its creator provides a custom license but says anything created with it cannot be released on DSPs (digital streaming platforms) as your own work. It can be used for free to sync with images and videos, with attribution (mention @mubertapp and the hashtag #mubert), and there's an option to contact them directly if a commercial license is needed.
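As we understand it, the notebook's trick is matching your text prompt to Mubert's music-tag vocabulary using sentence embeddings, then requesting a track for the best-matching tag from Mubert's API. Here's a simplified sketch of that matching step, with an illustrative four-tag list standing in for Mubert's full tag set (the API call itself is omitted – see the notebook for the real endpoint and licence terms).

```python
# Simplified sketch of prompt-to-tag matching for a text-to-music workflow.
# MUSIC_TAGS is an illustrative stand-in for Mubert's full tag vocabulary.
from sentence_transformers import SentenceTransformer, util

MUSIC_TAGS = ["ambient", "synthwave", "epic orchestral", "lo-fi hip hop"]

model = SentenceTransformer("all-MiniLM-L6-v2")
tag_embeddings = model.encode(MUSIC_TAGS, convert_to_tensor=True)

def closest_tag(prompt: str) -> str:
    """Embed the prompt and return the most similar music tag."""
    prompt_embedding = model.encode(prompt, convert_to_tensor=True)
    scores = util.cos_sim(prompt_embedding, tag_embeddings)[0]
    return MUSIC_TAGS[int(scores.argmax())]

# The chosen tag would then be sent to Mubert's API to generate the track.
print(closest_tag("haunting synth soundtrack for a night-time chase scene"))
```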

Character Rigging

Reallusion's Character Creator 4.1 has launched with built-in AccuRIG tech – this turns any static model into an animation-ready character and also comes with cross-platform support. No doubt very useful for the assets you might want to import from any AI generators you use!

Motion Capture Developments

That ever-ready multi-tool, the digital equivalent of the Swiss army knife, has come to the rescue once again: the iPhone can now be used for full-body mocap in Unreal Engine 5.1, as illustrated by Jae Solina, aka JSFilmz, in his video (below). Jae used move.ai, which is rapidly becoming the gold standard in markerless mocap tech and for which you can find a growing number of demo vids on YouTube showing just how detailed the captured movement can be. You can find move.ai tutorials on Vimeo here, and for more details about which smartphone models you can use, go to their website here – it's very impressive.

Another form of mocap relies on the detail of the image itself. Reality Capture has launched a tool that you can use to capture yourself (or anyone else for that matter, including your best doggo buddy) and import the resulting mesh into Unreal's MetaHuman. Even more impressive is that Reality Capture is free – download details here.

We’d love to hear how you get on with any of the tools we’ve covered this week – hit the ‘talk’ button on the menu bar up top and let us know.

Fests & Contests Update (Oct 2022)

Tracy Harwood Blog October 17, 2022

Prazinburk Ridge

It's no surprise to hear that Martin Bell's Prazinburk Ridge has won its first award, Best Animation – and very fitting that it should be at the North of England's Wigan and Leigh Film Festival, barely a stone's throw from Huddersfield, where the main character in the story hailed from. Many congratulations, Martin!

You can also see our review of the film on our YouTube channel here –

UE: Creep It Real

We're possibly a bit late in notifying you, but a nice little Unreal contest launched earlier this month – Unreal Challenge: Creep It Real! Here's the link – the deadline is 29 October. There are some great prizes for video content of LESS THAN 1 MINUTE created with the featured assets, so late as we are in posting this, there's still no excuse for not participating! There were 450 entries to their Better Light Than Never contest, held earlier in the year, so we're looking forward to seeing the sizzle reel from entries to this one in due course.

Unreal Challenge: Creep It Real

MacInnes Studios’ Dance Challenge

Another contest has launched, hosted by John MacInnes, aka MacInnes Studios, and it's hot on the heels of his Mood Scene contest, the results of which we look forward to seeing soon. The new contest is all about dance moves – check out the details here – it started on 1 October and runs for 30 days.

MacInnes Studios Dance Challenge – Oct 2022

And if you want to hear John talk more about his use of avatars and 'the future of digital humans', here's a great webinar you can catch up on too, hosted by Faceware (one of the Dance Challenge sponsors).

Open Calls

There are numerous experimental film festivals that are currently calling for entries – check them out on ExpCinema.org – we liked the look of Underneath the Floorboards!

S3 E48 Film Review: 'The Eye: Calanthek' by Aaron Sims (Oct 2022)

Tracy Harwood Podcast Episodes October 12, 2022

In this episode, we review Tracy's pick for the month: 'The Eye: Calanthek' by Aaron Sims, made in Unreal Engine 5 using MetaHuman tech as an early exemplar of the engine's capabilities (released 2021).



YouTube Version of this Episode

Show Notes and Links

We discuss the eyes, the monster, the surprise and camera shots.

Time stamps

1:06 Tracy introduces 'The Eye: Calanthek' by Aaron Sims, released 4 November 2021

5:56 What makes it so realistic? The eyes!

11:02 Things that break the storytelling

15:02 Does knowing the craft of filmmaking restrict creative approaches to filmmaking?

Links

Aaron Sims interview with Allan McKay (Ep 364, Filmmaking in Unreal), and another with Ian Failes on befores & afters

Aaron Sims YouTube channel, with ‘behind the scenes’ reviews where you can leave questions for him to answer

Projects Update (Oct 2022)

Tracy Harwood Blog October 10, 2022

This week’s Projects Update on machinima, virtual production and content creation:

The Crow

One of the most interesting creative projects we've seen so far using MidJourney, a creative AI generator, is The Crow (by Glenn Marshall Neural Art). Here the generator has been used to recreate a version of the ballet performance portrayed in the short film Painted (by Duncan McDowall and Dorotea Saykaly). Stunning, to say the least, and we recommend you play it at least once side-by-side with the original performance for added insight.

We’re so impressed with the potential of AI generators, whether that’s DALL-E, MidJourney, Stable Diffusion or any of the others that are now emerging, that we’re going to dedicate a special episode of the podcast to the subject next month, so watch out for that!

Jim Henson Company

Jim Henson Company is using real-time animation on its new show, Earth to Ned. Characters are created with Unreal (it's the AI in the background), and JHC has been so impressed with the workflow, and the lack of any post-production requirement, that it is looking to use the virtual production method more. What's interesting is the level of feedback guests experience in the process: they are not aware of the puppeteering in the background, just the virtual actor on the screen, performing naturalistically in real time! We've not seen much of this kind of machinima before, although Hugh Hancock did some very early work on it and, of course, Rooster Teeth have done live performances using similar techniques. We can certainly expect to see a lot more of it, particularly for interactive theatre, VR and AR.

Half-Life 3

Half-Life 3 was never going to be like the originals? This article on TechRadar is interesting: the author (Phil Iwaniuk) contends that remakes in the Half-Life franchise could never be like the originals, because the extreme attention paid to the world of HL created so much pressure that the Valve team could never live up to it. We're not sure about that, but it's an interesting idea.

source: Valve

Dune: Awakening

A very impressive MMO set in the Dune world, Dune: Awakening, has launched and is currently in beta. Here's the trailer – we're looking forward to seeing machinima made with this –

Dungeons & Dragons?

What does Dungeons & Dragons, typically a game played around a table, have to do with machinima? There's been a rise in the popularity of web-based shows where people play the game and act out scenes. This group (Corridor Crew) is using Unreal Engine 5 for virtual production (not quite The Mandalorian, but sort of similar) to put their actors, in real time, into the environments of their adventure. Check it out here –

S3 Ep 47 'How I learned Unity without following tutorials' by Mark Brown

Tracy Harwood Podcast Episodes October 5, 2022

This week, we share feedback on our reviews from September’s selection and discuss a video about learning production techniques, selected by Phil: ‘How I learned Unity without following tutorials’ by Mark Brown.



YouTube Version of this Episode

Show Notes & Links

0:47 Ricky highlights some of the news items we've released in a series of blogs this month, published on Mondays: the Tech Update on 3 Oct, the Projects Update on 10 Oct, the Fests & Contests Update on 17 Oct and, finally, a Report on Creative AI Generators on 24 Oct. You can find these at https://CompletelyMachinima.com/Blog

2:19 Feedback on previous episodes – see the original episodes on our website relating to Prazinburk Ridge, The Inquisitor and person2184, and Facing the Wolf

6:51 'How I learned Unity without following tutorials' (Developing 1) by Mark Brown, on his Game Maker's Toolkit channel, released 28 September 2021 – discussion covering learning by rote vs learning concepts, different learning styles, and the role of prototyping in machinima and virtual production.

Q: What are your favourite tutorials on machinima making? Get in touch and let us know via email or our other channels.