- Select Object in Viewport and Find in Content Browser: Ctrl+B
- Holding Shift while moving a game object moves the camera with it.
- Simple Emissive Material: Create Material → 3+LMB to create a 3-channel Constant node → warm up the color → M+LMB to create a Multiply node → set Const B to 20+ → connect to Emissive Color.
- Ctrl+L+LMB while hovering over an object in the scene will create a colored light picked from the object's material.
- Build animations inside Sequencer. This is great as I previously relied on 3DS Max and Motionbuilder.
- Sky, cloud, and environment lighting. Author and render realistic or stylized skies, clouds, and other atmospheric effects for any time of day with full artistic freedom.
- New water system. Water reacts with game objects more realistically.
- Production-ready hair and fur.
- Render-to-Movie improvements.
- Remote control API. More LED functionality than anything else; I suppose this would also allow control surfaces to interact with the Engine.
- nDisplay support for large LED volumes. Leveraging the greenscreen-replacement tech popularized in The Mandalorian.
- Collaborative Viewer template. Used for multi-user VR design reviews.
Environment Creation Workflow
- Ask and answer some basic questions:
- Who inhabits the environment?
- What is it used for?
- Where is it?
- When is it used?
- Create a reference board of related images.
- Draw a simple map of the location.
- Distill ref board down to a mood board that conveys:
- Art style.
- Subject matter.
- Choose a general direction to explore. In the case of an environment, perhaps it's lighting, structure, or textures. Don't get too attached to any one idea.
- Refer to project documentation. Look through what's already been written about the environment. If nothing's been written yet, maybe write something?
- Add existing visual elements. Leveraging the work that's been done with the reference images, just copy that collection into a new board and work from there.
- Collect inspiring design. Hunt for abstract representations of what you're looking for. The vaguer the better.
- Add photographs. Just to hammer home the point that the previous step was more abstract.
- Add colors and fonts. Pull from the existing images or find a palette that fits what's developing in this soup.
- Add motion and animation. Having strong, animated references can go a very long way in solidifying a concept.
- Arrange the materials. Determine the order of the references. Size and relationships matter at this stage. Take your time.
- Annotate. Add notes to reinforce and explain why certain elements fit. This provides context for…
- Get feedback. Not as big a concern for the soloist, but even leaving off at this stage for a day or two, then looking again with fresh eyes.
- Establish asset naming conventions:
- SM prefix for static meshes.
- T prefix for textures with appropriate suffixes:
- _D for base color.
- _N for normal map.
- _M for occlusion/roughness/metallic masks.
- M prefix for master materials.
- MI prefix for instanced materials.
- Establish a file hierarchy.
- Create an asset list.
- Draft Rough Unified Model of Environment: work quickly in modeling software to visualize the map, but work smart. Think modular, and plan for future detachment and refinement. Import the UE Mannequin for scale. Ensure units are set to centimeters!
- Export as Single Attached Mesh: export to the Meshes folder in the Unreal project.
- Import to Unreal Engine: Deselect Auto Generate Collision, Deselect Advanced → Generate Lightmap UVs. Material: Material Import Method (Do Not Create Material), Deselect Import Textures.
- Walk Through Blockout: Get a sense of the space, check for scale issues.
- Refine Model: Return to modeling software and create a refined version of the space. Break larger areas into smaller chunks for future asset management, e.g. rooms divided into walls.
- Export/Import Refined Meshes
- Repeat for Props
- Map pieces like walls, rocks, and any objects not regarded as moveable should all share a common central pivot point in the modeling software. This ensures that they are correctly placed when imported into Unreal. Props should have their pivot points grounded, centered, and balanced on a per-prop basis.
- Cloned objects in the World Outliner will see a global replacement if their base model is overwritten.
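The "grounded, centered, balanced" pivot rule above can be sketched numerically. This is a hypothetical standalone helper (plain structs, not engine or DCC types) that translates vertices so the pivot lands at the bottom-center of the prop's bounding box, assuming Z-up as in Unreal:

```cpp
#include <algorithm>
#include <cassert>
#include <vector>

// Minimal stand-in for a mesh vertex; real DCC/engine types differ.
struct Vec3 { float x, y, z; };

// Ground a prop's pivot: translate vertices so the origin (pivot) sits at
// the bottom-center of the bounding box. Z-up is assumed, as in Unreal.
void GroundPivot(std::vector<Vec3>& verts) {
    if (verts.empty()) return;
    float minX = verts[0].x, maxX = verts[0].x;
    float minY = verts[0].y, maxY = verts[0].y;
    float minZ = verts[0].z;
    for (const Vec3& v : verts) {
        minX = std::min(minX, v.x); maxX = std::max(maxX, v.x);
        minY = std::min(minY, v.y); maxY = std::max(maxY, v.y);
        minZ = std::min(minZ, v.z);
    }
    const float cx = (minX + maxX) * 0.5f;
    const float cy = (minY + maxY) * 0.5f;
    for (Vec3& v : verts) { v.x -= cx; v.y -= cy; v.z -= minZ; }
}
```

In practice this operation is done in the modeling package before export; the sketch just makes the intent concrete.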
- Blocking Volume: Modes → Volumes → Blocking Volume. Useful for roughing out environment collision such as floors.
- Generate Per Mesh: Mesh Editor → Collision → Select Method
- Auto Convex Collision: Simple collision wrap. Refine with Convex Decomposition → Apply. Collision accuracy will vary.
- Generate in Modeling Software (Best Practice): Create a simplified version of the target Mesh, add the UCX_ prefix and a _NUM suffix per part, e.g. UCX_SM_Chair_01, UCX_SM_Chair_02, etc.
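The UCX_ pattern is mechanical enough to express as a throwaway helper. This is hypothetical, not part of any UE or DCC tooling; it just encodes the zero-padded naming rule above:

```cpp
#include <cassert>
#include <cstdio>
#include <string>

// Hypothetical helper (not a UE API): builds the UCX_ collision-mesh name
// for a given render mesh and part index, zero-padded to two digits as in
// UCX_SM_Chair_01, UCX_SM_Chair_02, ...
std::string MakeCollisionName(const std::string& renderMesh, int part) {
    char buf[8];
    std::snprintf(buf, sizeof(buf), "%02d", part);
    return "UCX_" + renderMesh + "_" + buf;
}
```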
- Create Basic Asset → Material.
- Use M_ prefix for master material.
- T+LMB: Place 3 Texture Sample nodes.
- Connect RGB of Texture Sample 1 to Base Color.
- Connect RGB of Texture Sample 2 to Normal.
- Connect R of Texture Sample 3 to Ambient Occlusion.
- Connect G of Texture Sample 3 to Roughness.
- Connect B of Texture Sample 3 to Metallic.
- Material Expression Texture Base → Texture: Assign 127grey (Engine Content) to Samples 1 and 3. Assign BaseFlattenNormalMap (Engine Content) to Sample 2.
- RMB on Texture Sample → Convert to Parameter. Name the new Param2D nodes Diffuse, Normal, and Mask, respectively.
- Clone this master material for each subgroup of assets, e.g. Props, Structure.
- RMB on Material → Create Material Instance. Use MI_ prefix to denote material instance.
- Create folders within the Materials folder for Props, Structure, etc. Create material instances for each asset in their appropriate folders.
- Assign textures.
- Export alpha-pass textures in Painter:
- Unreal Engine 4 (Packed)
- Naming convention reminder: _D for Diffuse, _N for Normal, _M for Mask.
- Import to appropriate Textures sub folder in Unreal.
- For the Mask, edit the imported Texture → Compression → Compression Settings → Masks (no sRGB). Ensure that Texture → sRGB is disabled.
- Edit MI_ → Details → Parameter Groups → Texture Parameters → enable Diffuse, Mask, Normal → assign textures.
- Edit SM_ → Details → Material Slots → assign material.
This pipeline now allows for external texture editing and a simple re-import to change the finished look of meshes.
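The _M mask above is a channel-packed texture: one RGB texel carries three grayscale inputs, which is why Sample 3's R/G/B pins wire to AO/Roughness/Metallic. A minimal sketch of that packing convention (plain C++, not engine code; 8-bit channels assumed):

```cpp
#include <cassert>
#include <cstdint>

// One texel of the "_M" mask texture: three grayscale maps packed into a
// single RGB texture, halving sampler count versus three separate textures.
struct PackedMask { uint8_t r, g, b; };

// Convention from the material wiring above: R=AO, G=Roughness, B=Metallic.
PackedMask PackMask(uint8_t ao, uint8_t roughness, uint8_t metallic) {
    return PackedMask{ao, roughness, metallic};
}

uint8_t AmbientOcclusion(PackedMask m) { return m.r; }
uint8_t Roughness(PackedMask m)        { return m.g; }
uint8_t Metallic(PackedMask m)         { return m.b; }
```

This is also why the Mask texture must be imported with sRGB disabled: the channels are data, not color.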
Place all of the 0,0,0 objects, then work from large to small. Don't be afraid to slightly angle, scale, and offset objects. Make the space look lived in, used. Even at the alpha stage, there's still a ton of room to experiment and add environmental storytelling.
Use a color swatch to drive lighting and mood. Use it to color pick lighting tones. Creating a light-proof "shell" around single-sided meshes will help reduce or eliminate light leak.
- Directional: can mimic sun/moonlight.
- Point: use sparingly. Details → Light → disable Cast Shadows. A Spot light is a cheaper alternative.
- Spot: the workhorse; good for overhead lights.
- Rect: one-sided shaped light, good for TV-screen effects or overhead fluorescent panels. Use sparingly.
- Sky: ambient light.
Stack Emissive materials, Point, and Spot lights to layer and create more realistic lighting.
Lights have various parameters in their Details panel:
- Intensity: overall brightness.
- Attenuation Radius: the distance the light reaches; nothing beyond this radius is lit.
- Light Profile: adds a real-world physical light model (IES profile).
- UVs should have plenty of padding to avoid overlapping light bleed in baked results.
- Use a 2nd UV channel if necessary.
- The lightmap size is set on a per-mesh basis, found in Details → General Settings → Light Map Resolution.
- Alt+0 reveals lightmap density in the scene. Green indicates ideal density. Adjust the lightmap sizes for all objects to produce an even texel density.
- Check World Settings → Lightmass → Lightmass Settings for more variables to play with. Particularly, enable Use Ambient Occlusion and Generate Ambient Occlusion Maps, and tweak Num Indirect Lighting Bounces and Num Sky Lighting Bounces.
- Lightmass Importance Volume: ensure one is in the World Outliner and encapsulates the entire scene to be lit.
- Lightmass Portal: use these in openings where light comes into the scene, such as doorways and windows. Shape accordingly.
- Reflection Captures: Modes → Visual Effects → Sphere/Box Reflection Capture. Generously place these wherever reflections will be expected. Also modify Project Settings → Engine → Rendering → Reflections → Reflection Capture Resolution for increased detail.
Post Process Volumes
- PostProcessVolume: adds global visual effects. Experiment with settings, have fun.
- AtmosphericFog: adds fog. Experiment with settings, have fun.
Blueprint to C++
- Hover over a term in Visual Studio and press F12 to go to the definition.
- Quick to compile and iterate.
- Easy to understand (visually) and easy Unreal API discovery.
- Designer and artist friendly.
- Performant, but may not necessarily be the bottleneck in a project.
- Full access to the engine.
- Works well with Version Control.
- Compact and readable (assuming the reader is familiar with the syntax).
- More maintainable at scale.
Creating C++ Base Classes
- C++ has no knowledge of the Blueprint system.
- Blueprint must always inherit from C++.
- Blueprint can interact with C++ and other Blueprints, but C++ can only interact with other C++ classes.
- C++ can make runtime calls into Blueprint, but only through interfaces declared in C++.
- Check parent class type: Open target Blueprint → Class Settings → Details → Class Options → Parent Class.
- File → New C++ Class… → of matching type. Use unprefixed matching name.
- Check .cpp and .h files in Visual Studio. For Scene Component, the UCLASS in the .h may require Blueprintable.
- Build project in VS.
- Change the parent class type to the newly created C++ class.
- Log statement: UE_LOG(LogTemp, Warning, TEXT("WARNING_TEXT")); — see LogVerbosity.h for the full list of verbosity levels.
- Window → Developer Tools → Output Log: view the log warnings.
- PrimaryActorTick and PrimaryComponentTick (.bCanEverTick = true/false;) control whether the Tick function is polled every frame.
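A sketch of the bCanEverTick pattern, with the engine's tick-function struct stubbed down so it compiles standalone. FTickFunctionStub and AMyActorStub are made-up names; in real code this is FActorTickFunction inside an AActor constructor:

```cpp
#include <cassert>

// UE's tick-function structs stubbed for illustration; in real code these
// are FActorTickFunction / FActorComponentTickFunction.
struct FTickFunctionStub { bool bCanEverTick = false; };

struct AMyActorStub {
    FTickFunctionStub PrimaryActorTick;
    int TickCount = 0;

    AMyActorStub() {
        // Matches the pattern above: opt out of per-frame polling entirely
        // when the actor has no per-frame work.
        PrimaryActorTick.bCanEverTick = false;
    }

    // The engine only calls Tick when bCanEverTick is true; simulated here.
    void SimulateFrame() { if (PrimaryActorTick.bCanEverTick) ++TickCount; }
};
```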
Actors vs. Components
- Can be placed in levels.
- Can have components.
- Default variable values.
UPROPERTY: Exposing Variables
Migrating variables from Blueprint to C++ while maintaining Blueprint functionality.
- Add a protected section inside the class in the .h file.
- Annotate the variable with the UPROPERTY(AccessType, BlueprintReadType) macro. This exposes the variable to Blueprint.
UPROPERTY Access Types
- Add variable(s) formatted as C++ standard: type VariableName = value.
- Build in Visual Studio.
- Compile in Unreal Engine.
- Replace any existing references to the old Blueprint variables (these will have the _0 suffix, search by selecting and Alt+Shift+F) with the new C++ variable, then delete the _0 Blueprint variable.
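The migration above boils down to a header like the following sketch. UPROPERTY is stubbed to a no-op here so it compiles outside the engine; UOpenDoorStub and TargetYaw are hypothetical names, and the variable is shown public only so the default is visible (the steps above put it in a protected section):

```cpp
#include <cassert>

// UPROPERTY is a UnrealHeaderTool macro; stubbed to a no-op so this sketch
// compiles standalone. EditAnywhere/BlueprintReadWrite are examples of the
// access/read specifiers mentioned above.
#define UPROPERTY(...)

class UOpenDoorStub /* hypothetical component name */ {
public:  // would normally be protected, with access granted via UPROPERTY
    UPROPERTY(EditAnywhere, BlueprintReadWrite)
    float TargetYaw = 90.f;  // the C++ default replaces the Blueprint default
};
```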
Exposing C++ functions to be called from Blueprint.
Mapping Blueprint to C++ Types
Basic Steps for Pure Function
- Open target .h file.
- Write UFUNCTION(BlueprintCallable, BlueprintPure) under the BeginPlay() declaration.
- Follow with CPPType FunctionName() const;
- Hover over the function name until the tooltip appears, then Copy signature to clipboard.
- Open .cpp file (Ctrl+K, Ctrl+O).
- Paste below TickComponent.
- Translate the Blueprint script into C++ code.
- Build and compile.
- Delete the now-legacy Blueprint function.
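The steps above produce a header like this sketch, with UFUNCTION stubbed so it compiles standalone. UTotalMassStub and GetTotalMassKg are made-up names; BlueprintPure produces a node with no exec pins, which the const signature mirrors in C++:

```cpp
#include <cassert>

// UFUNCTION is stubbed so this compiles outside the engine.
#define UFUNCTION(...)

class UTotalMassStub /* hypothetical component */ {
public:
    // BlueprintPure: side-effect free, no exec pins; hence const.
    UFUNCTION(BlueprintCallable, BlueprintPure)
    float GetTotalMassKg() const { return MassA + MassB; }

    float MassA = 30.f;
    float MassB = 20.f;
};
```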
Tracing Unreal Engine C++ API Names
- Hover over Blueprint Node in question, look for "Target is..."
- Ctrl+, in Visual Studio and enter the target name + .h for header lookup (e.g. SceneComponent.h).
- Ctrl+F the Node name in the header file (e.g. GetWorldLocation).
- F12 on the Unreal function call to find the return value (e.g. K2_GetComponentLocation() returns GetComponentLocation()).
2015 Legacy Notes
- RC: mouse Right-Click.
- DC: mouse Double left-Click.
- MO: mouse over.
- BPE: BluePrint Editor.
- SME: Static Mesh Editor.
- PBR: Physically-Based Rendering.
- PPV: Post Processing Volume.
- PhAT: Physics Asset Tool.
- 1 Unreal Unit = 1 centimeter.
- ` (tilde): swaps between world and local axis.
- End: drops selected pawn down to "floor" (top of underlying mesh).
- LC Component Transform → Location: alternates between absolute and relative to parent.
- Shift+Move object: moves camera at the same time.
- Viewport Options → Lock Actor to Viewport: orients camera to selected actor (light, other in-scene camera) for 1st person direct aiming.
- Lockable Content Browser: clicking the lock icon in the top right corner of the CB will cause a new instance to open the next time an asset is browsed-to (CTRL-B). Up to 4 CBs can be opened this way.
- CTRL+E: Open BPE for selected Actor.
- RC Content Folder: set folder color.
- F8: jump out of the character control while the game is running.
- `: toggle console.
- Show Collision: display wireframe collision meshes.
- r.Cache.DrawLightingSamples 1: show static light samples.
- ShowFlag.VisualizeMotionBlur 1: show motion blur fields.
- HighResShot: take a high resolution screenshot. Incompatible with Post Process Temporal AA, turn this off or use FXAA instead.
- RestartLevel: reloads the current level.
- Stat soundwaves: list the playing Soundwaves and their volumes.
- Stat soundcues: list the playing SoundCues and their volumes.
- Stat sounds -debug: sound asset must have 'debug' set to true for attenuation settings to display.
- Stat sounds off: remove debug stats.
- Stat reverb: list active reverb effects.
- Stat soundmixes: list active SoundMixes.
- Log logaudio verbose: logs most audio events.
- Log logaudio veryverbose: logs everything audio to .log file.
Asset Naming Conventions
- P_: particle system.
- SM_: static mesh.
- S_: sprite.
- SK_: skeletal mesh.
- _Physics: physics asset.
- T_: texture.
- _BP: Blueprint.
- M_: Master Material.
- MI_: Material Instance.
- To create custom collision, build the mesh inside 3DS Max, assign it an appropriate name (see next) and include it during FBX export.
- The naming convention goes as follows:
- UBX_[RenderMeshName]_## for box colliders.
- USP_[RenderMeshName]_## for sphere colliders.
- UCX_[RenderMeshName]_## for convex colliders.
- Morph target reference.
- Materials can be imported with the model and automatically assigned.
- 2020: Better to learn new Chaos system.
- RC on the target mesh in the Content Browser → Create Destructible Mesh.
- DC the _DM mesh.
- Adjust variables.
- Implement _DM mesh in Level, enable on Component - Physics → Simulate Physics.
Blueprint Notes (Bolded = Node Names)
- Alt+LC on Connected Pin: snip connection(s).
- Map Range: clamp values based on input min/max, scale to new output min/max.
- Vector Length: returns the magnitude of the input (inputs are cast to vector).
- RC on Vector pins to split them.
- Set a value then RC the input pin to Promote to Variable with the set value as default.
- Line Trace by Channel: similar to Raycast; the Out Hit pin can be broken to delineate hit results.
- Click the Eye icon next to any Blueprint Variable to expose it on the Component in the Editor.
- Using too many heavy Event Tick-driven actions can lead to poor optimization, much the same as too many Update() methods in Unity.
- Branch: conditional statement, true/false boolean operator.
- Get Owner: analogous to this.gameObject in Unity.
- Clamp (Int): establishes range of public variable.
- Find Look at Rotation: noted for usefulness.
- OnComponentBeginOverlap/EndOverlap: functional equivalents to OnTriggerEnter/Exit.
- Toggle: sends a simple boolean message to Target Blueprint's Event Toggle node.
- Unrotate Vector: noted for usefulness, gets inverse of input vector.
- Switch Has Authority: check if the script is being run on the server (Authority) or on a client (Remote).
- Attach To: sticks one _SM to a socket in target _SM. Also used for Parenting, similar to Unity.
- Detach from Parent: as it says.
- Reroute: organizational node, creates spline vertex. Can be used to join like outputs into a single wire (e.g. multiple Sets)
- Load Stream Level: as it says. Combine with Level Streaming Volume to control visibility of load.
- B+LC: draw Branch node.
- S+LC: draw Sequence node.
- Actor Has Tag: boolean check for tag.
- Plugging an Enum into a Select node will populate the selection pins with the enum values. Connecting a variable to any new pin will update the others to accept the first's data type.
- Blueprints built off of the Construction Script seem to function similar to Editor scripts in Unity: whatever's in there happens in-editor while the level is being built, as opposed to constructing something from an Event Begin Play node in the Event Graph, which operates more like a Start() function and builds the object on play. Note: construction scripts fire whenever an actor is manipulated (translated, rotated, scaled, or variable-adjusted).
- Variables can be dragged directly onto pins for quick wiring.
- Construction Blueprints (my term for a Blueprint focused on building in editor via Construction Script) are particularly well-suited to instancing procedural content. Stairs, fences, anything that requires orderly or random array distribution. This would be accomplished through Editor scripts in Unity.
- To make a vector variable visible in the editor, enable Show 3D Widget by clicking on the variable → Details.
- Example 1.3 (in the 2014 Example Project) will be useful for Lazy Susan Inventory System.
- Example content showing different methods of triggering effects and interactions. Of note:
- 2.3: Spawn Blueprint and reset with Dispatch Event.
- 3.2: Various 1st person interactions (levers, cranks, pulls).
- 3.3: Material changes based on inputs!
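Several of the math nodes above have direct C++ analogues; Map Range (clamped), for example, is just a clamp followed by a linear remap. A standalone sketch (hypothetical function name, not the engine's implementation):

```cpp
#include <algorithm>
#include <cassert>

// C++ analogue of the Map Range (Clamped) Blueprint node: clamp the input
// to [inMin, inMax], then rescale into [outMin, outMax].
float MapRangeClamped(float value, float inMin, float inMax,
                      float outMin, float outMax) {
    const float clamped = std::clamp(value, inMin, inMax);
    const float t = (clamped - inMin) / (inMax - inMin);
    return outMin + t * (outMax - outMin);
}
```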
Blueprint Mouse Interaction
- Point-and-Click Interaction as driven by Blueprint:
- Get Hit Result Under Cursor by Channel: a node that pretty much encapsulates screen-based raycasting. See also Get Hit Result Under Finger… to take touch-based interfaces into consideration.
- Break Hit Result: extracts the data from a hit for easy wiring, e.g. Location, Normal, Hit Actor, etc.
- Wire off the result of a detected hit from the above chain into whatever logic.
- To get feedback from mouseover, use OnBeginCursorOver (Static Mesh) and OnEndCursorOver (Static Mesh). A useful node to wire from the Begin is Set Current Mouse Cursor, which can select from an enum of default cursors. Currently investigating how moddable this node is.
- While possible to draw and change a HUD completely through Blueprint, it seems to me that more often than not this type of interaction is handled best through UMG UI.
- To link Blueprints to UMG activity:
- Bind target UMG fields and create variables within bindings.
- If you want to pull a value from an external Blueprint, in the bind Get the target Actor, cast it, then wire off the external variable to the return value.
- Create target Get UMG Widget as variable in sending Blueprint.
- Wire off from Get to expose target variable in UMG Widget, can now be Get and Set.
- Unreal splines can be used for a variety of effects:
- For pathing, such as with particles or static meshes.
- Spacing and orienting static meshes, for chains and iterative shaped patterns.
- Bending static meshes.
- Moved via timeline animation.
- The splines can be created and edited in the editor viewport (scaling expands both tangent handles symmetrically), but being able to edit and import splines from external packages like Illustrator would be a godsend. I wonder if the inability to break tangent handles has something to do with that? Either way, I hope they're working on improving that, as more customizable and fluid spline creation would lead to more creative work in engine.
- Every example in this level of Content Examples is useful.
- Replication → Replicates (bool): any Actor with this set to true will be visible to client connections, and vice versa. Likewise, individual variables have a Replication field in their settings.
- Switch Has Authority node can be used to run different logic for client/server users, render completely different scenes, enable different Pawn powers, etc.
- The server can take on computational load and distribute results to clients by enabling the RepNotify property on variables.
- Custom functions also have replication properties:
- Reliable/Unreliable: unreliable calls can be dropped during heavy server load.
- Multicast: called on server and forwarded automatically to client.
- Server: executed on server only.
- Client: called by server and executed on owning client.
- Note that Server/Client replication functions can only be used on actors that have a Net Owner, i.e. are player-controlled or owned by a player controller.
- One key aspect for optimizing performance of network play is deciding what events are seen by which players, known as Relevancy. This is handled at a basic level by setting the Net Cull Distance Squared as part of the Actor's Details → Replication in Blueprint. Depending on replication settings, actions may be invisible to clients beyond this radius.
- RepNotify on an event can be used to store the conditions of events that are culled from clients; however, this is imperfect, as the event will fire when clients cross the cull boundary.
- Typically, hybrid solutions that divide effects and stored conditions are needed for optimal performance.
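The relevancy cull above amounts to a squared-distance test per actor per client; storing the threshold pre-squared (hence the name Net Cull Distance Squared) avoids a sqrt per test. A sketch with a stand-in vector type (FVectorStub is made up; the engine's actual relevancy logic considers more than distance):

```cpp
#include <cassert>

// Stand-in for FVector; illustration only.
struct FVectorStub { float X, Y, Z; };

// Squared-distance relevancy test: comparing squared values avoids a sqrt
// per actor per client, which is why the property is stored pre-squared.
bool IsWithinCullDistance(FVectorStub actor, FVectorStub viewer,
                          float netCullDistanceSquared) {
    const float dx = actor.X - viewer.X;
    const float dy = actor.Y - viewer.Y;
    const float dz = actor.Z - viewer.Z;
    return dx * dx + dy * dy + dz * dz <= netCullDistanceSquared;
}
```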
- Heightmaps of up to 8192x8192 are feasible.
- Landscape Splines can be leveraged to quickly create roads, walls, or any other linear features.
- Foliage paints sets of Static Meshes on Landscape Actors or other Static Meshes using hardware instancing, grouping them into single draw calls. Each SM assigned to the Foliage brush has customizable procedural generation properties.
- Timeline/keyframe editing tool for controlling cameras, lights, and events to create cinematics or directed sequences.
- Will require a more thorough investigation for better understanding.
- Note: highly useful for recording demo content.
- RC texture asset → Create Sprite: convert texture to sprite.
- RC texture asset with sprite sequence (atlas) → Extract Sprite: pulls all individual sprites from a sheet into the Content Browser. Note: resulting sprites are numbered based on sequence from top left of sheet, and can be easily marquee selected and converted into Flipbook, UE4's sprite animation editor.
- Custom collider shapes are easily defined through the editor, which will first create a "best fit" box collider to which custom vertices can be added and manipulated for a tighter fit.
- Lock sprites to the Y plane and enable physics.
- Sprites and sprite sheets can be pixelated through content settings: NoMipmaps, Filter: Nearest, Compression Settings: Editor Icon.
- Creating platforms in 2D that can be jumped through from below: this video from 1:05:45.
Notes from Video Series
- Material RGB correspond to Transform XYZ. Absolute World Position → Component Mask nodes can be used for masking, in 2D enable only R and B channels.
- LC+1: spawn Material Expression Constant node, RC to convert to Parameter.
- LC+D: spawns Divide node.
- LC+M: spawns Multiply node.
- TextureSampleParameter2D → rename SpriteTexture: create texture node for use with Paper2D
- RC any Texture asset → Sprite Actions → Create Sprite: make sprite from texture.
- In Sprite Editor:
- Edit Source Region: clip to target sprite area.
- Edit Collision: clip collision area.
- Edit Render Geometry: refine clip to exclude artifacting.
- RC selected Sprites or Sprite sheet → Make Flipbook.
- Use PaperCharacter for creating 2D sprite-based characters.
- Set Source Blueprint in Sprite Details to the idle state.
- If the character sprite is too small, resize using the Property Matrix and adjusting Pixels Per Unit (smaller values make the sprite larger, and vice versa; keep the value a power-of-two decimal).
- In CharacterMovement component, set Planar Movement → Constrain to Plane (Enabled) and Plane Constraint Normal = Y: -1.0 to lock movement to Y plane.
- Add a Camera SpringArm and disable Camera Collision → Do Collision Test as the camera will be flat-on orthographic, and disable Camera Settings → Inherit Yaw to prevent camera flipping.
- Rotate the SpringArm to Z: -90 so it's facing the character.
- Add and parent a Camera component to the SpringArm. Disable Camera Settings → Use Pawn Control Rotation to prevent rotation by controller input. Set Projection Mode to Orthographic.
- Create a Game Mode Blueprint. In Defaults → Classes set Default Pawn Class to the target character.
- Edit → Settings → Project → Maps & Modes → Set Default Maps (Game, Editor Startup, Server) to created map, and set Default Modes (Default and Server) to created mode.
- In Settings → Engine → Input, define some movement Bindings. Use positive scale for positive direction, negative for negative (e.g. D key with Scale 1.0 to move right, A key with Scale -1.0 to move left).
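Constrain to Plane conceptually removes the velocity component along the plane normal, v' = v − (v·n)n; with normal Y = −1 as above, all Y movement is stripped, locking the character to the XZ plane. A standalone sketch (V3 is a stand-in type, not FVector):

```cpp
#include <cassert>

// Stand-in vector type for illustration.
struct V3 { float x, y, z; };

// Project a movement vector onto a constraint plane by removing the
// component along the (unit) plane normal: v' = v - (v.n)n.
V3 ConstrainToPlane(V3 v, V3 n /* unit normal */) {
    const float d = v.x * n.x + v.y * n.y + v.z * n.z;
    return V3{v.x - d * n.x, v.y - d * n.y, v.z - d * n.z};
}
```

With n = (0, −1, 0), any Y component of the input is cancelled while X and Z pass through untouched.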
- A byproduct of working with this tutorial series was an exploration of macro functionality.
- RC in Content Browser → Blueprints → Blueprint Macro Library: creates a Macro Library in the Content Browser. This can then be opened and macros created. To access the macros across the project, ensure that Context Sensitive is disabled when using RC to place nodes in a Blueprint.
- On Actor → Physics → Simulate Physics (enabled): enables physics. Note that the Actor's static mesh needs a collision component to work.
- The engine bases skeletal mesh collision on the joints and skin weights of an imported asset, which can then be refined using the Physics Asset Tool (PhAT).
- Radial Force Actor: applies either a Constant Force or an Impulse to physics-enabled actors. Impulse is fired via Blueprint and delivers its entire force in a single frame.
- Physics Thruster: component that acts as a "jet engine" when attached to a physically-enabled actor.
- Blocking Volume: used to deflect or block physical movement.
- Physics Constraint: actor component that generates a customizable joint that can be attached to physically-enabled actors to simulate a wide array of joint-based movement (chains, swinging signs, train linkages, etc.)
- Angular Motor: subsection of the constraint that, when enabled, will spin and move the selected physics-enabled actor(s) using itself as the pivot/locomotive point. Think of the hub of a helicopter rotor, or a motorized pivot of a pendulum.
- Linear Motor: a "spring" physics function.
- Linear Position Drive: attempts to obtain a position relative to the constraint position.
- Linear Velocity Drive: applies force/attraction in direction from/to constraint's location.
- Joints can be made breakable by enabling Linear/Angular Breakable under those sections' Advanced rollouts.
- Automatic destruction mesh generation, depth of destruction, frame support (think blowing out panes of glass while leaving the frame intact, then letting a stronger force blow the frame out), damage on impact, & of course sounds & particle triggers.
- The workflow is almost identical to 3DS Max Slate editor/Substance node builder.
- HDR is supported (values greater than 1). With overridden values the glow intensity becomes more prominent, as well as any dirt textures that may be applied to the camera.
- Full PBR support.
- White in an emissive mask is visible, black invisible.
- Black in an opacity mask is transparent, white is opaque.
- Order of importance in Material setup: BlendMode → LightingModel.
- Translucency lighting modes:
- Non-directional (volumetric, no normal).
- Directional (volumetric, with normal).
- Surface (lit, reflective with nearest Reflection Capture Actor).
- Volumetric Directional allows the setting of Lighting Intensity.
- Translucency materials with Two Sided enabled are more expensive.
- Material Details → Lightmass → Lightmass Settings → Cast Shadow as Masked (Enabled): bakes any shadows cast by material into lightmap.
- Separate Translucency (Enabled): will not be affected by the Depth of Field effect.
- Material Domains:
- Surface: default, used for any kind of geometry.
- Deferred Decals: used with Decal Actors to create a patch that can wrap over geometry UVs (think splatter, puddles, graffiti).
- Light Function: modifies the projection of lights (cookies, can be animated).
- PostProcess: use for blending effects from inside PostProcess Volumes.
- Material → Shading Model → Subsurface: enables subsurface scattering driven by opacity mask and defined by a subsurface color map.
- Material → Blend Mode → Masked: used for hard clipping, for achieving low-cost see-through materials like chain-link fence, gratings, and grilles.
- Opacity Mask Clip Value: determines where the clipping takes place. Clever use of a greyscale mask can create unique and interesting effects when animating this value.
- World Position Offset: manipulates mesh vertices by Material. May cause shading errors, correcting those errors may result in performance hits. Useful for ambient animation.
- Tessellation: D3D11 feature, requires a height/displacement map & can produce performance issues if pushing vertices out of bounds.
- Refractions: Documentation reference 1, Documentation reference 2.
- Instanced Materials: construct main material, use Parameter nodes for wiring to output, RC in Content Browser → Create Material Instance. You can create instances of instances.
- Material Functions: convert a Material to a function using Make Material Attributes node, then wiring off a FunctionOutput node. RC Output → Start Previewing Node for real time preview.
- Layered Materials: using input color masks on alpha channel in MatLayerBlend_<type> nodes and blending (standard Substance workflow).
- Vertex Animation: further research required.
- Rotator/Panner: Coordinates nodes that produce animated effects in the material.
- DitherTemporalAA: anisotropic translucency and brushed metal effects with material masks.
- Material Details → Material → Material Domain → Deferred Decal: enables material as projectible decal.
- Decal Blending modes:
- Translucent: can project normal maps.
- Stain: transparent, no normals.
- Normal: standard opaque, with normals.
- Emissive: glow effect, no normals.
- Use an Opacity texture to mask the decals.
- Static Mesh → Receives Decals (Disabled): decals will not project onto these meshes.
- Actor → Decal → Sort Order: higher will project closer to camera.
- Decals can be animated just like any other material.
- Point Light: emits light in a sphere (visible when selecting the component in the editor).
- Spot Light: emits light in a cone shape defined by two cones. A penumbra is defined by the falloff between the Inner Cone Angle and Outer Cone Angle.
- Directional Light: emitted from an infinitely large source an infinite distance away, ideal for simulating the sun. Casts parallel shadows.
- Static Light: used for baking shadow information into Lightmaps. Bakes shadows only from meshes marked Static, and contained within Lightmass Importance Volumes. Produces area shadows based on Source Radius property.
- Lightmap Resolution on BSPs determines quality of baked shadows on individual meshes, smaller is denser/higher quality. Similar function on SM through Override Lightmap Res, but higher is greater quality.
- Static lights can also affect dynamic objects via the Indirect Lighting Cache Sample Grid, yet this effect will not cast accurate shadows.
- Static lights do not support light functions.
- Static lights may use IES lighting profiles.
- Stationary Light: intended to stay in one position, but can change brightness and color during run-time. Brightness changes only affect direct lighting, as indirect lighting and shadowing are baked into Light/Shadowmaps.
- Stationary lights support IES profiles.
- A maximum of 4 stationary lights can overlap at one time. This may be visualized through Stationary Light Overlap mode in the View menu.
- Moveable Light: can be moved and changed in game. Used for headlights, flashlights, swaying lamps, etc. Completely dynamic, no indirect lighting. Shadow casting from ML can cost 20x more than other lights.
- Moveable lights support IES lighting profiles.
- Lightmap density may be visualized through the View menu.
- IES Lighting Profile: simulates real-world light via texture applied to spot or point light. Spot lights will clip any of the profile beyond the 180 degree max arc; point lights transform into spot lights with the entire IES profile rendered.
- Light Function: Material applied to light to alter light's intensity. Can create hard, moveable cookie-like effects. Significantly more expensive to render than an IES profile, but allows for complex shapes. Only intensity can be modified by a light function.
- Falloff Exponent: found under Light Properties → Light → Advanced rollout, defines rate at which light loses energy over distance. Creates exponential falloff, best used for stylized visuals.
- Inverse Square Falloff: when enabled, produces realistic falloff, with light brightest closest to the source and rapidly dimming over distance. When enabled, the Brightness value of a light is measured in lumens and will require substantially higher values to register.
- Indirect Lighting: bounced light will be rendered most accurately on static objects, when baked into lightmaps. It will still have some effect on dynamic objects, but is determined by the light cache, and will not produce accurate shadows.
- Source Radius: simulates the size of the light source. Smaller values produce sharper shadows. The distance from the light also determines shadow sharpness: further is softer. Final shadow results depend on lightmap resolution. Example usage: simulate indirect skylight through a window.
- Source Radius also scales the size of the light's specular reflection in an almost 1:1 correlation.
- Source Length property simulates tubular lighting.
- Min Roughness property can be used to manually soften a light reflection. Typically the roughness value of a PBR texture should produce accurate light reflections on its own.
- Shadow Bias changes the accuracy of mesh self-shadowing. Lower is more accurate, but will introduce rendering artifacts.
- Shadow Filter Sharpen can artificially boost the sharpness of the edges of shadows. Only works on Moveable or Stationary lights cast on dynamic objects.
- Post Process Volume → Unbound (Enabled): creates global post process effects. Be careful: if the whole scene is rendering strangely, it may be because this is enabled somewhere.
- Film: used for processing scene color (tinting shadows, color mixing channels, saturation, contrast, crushing shadows and highlights, and dynamic range).
- Scene Color: vignetting, chromatic aberration, tone mapping, and global scene tinting. Note: this is the function that makes use of the ability to apply layer blending effects in Photoshop and exporting an LUT.
- Bloom: adds glow to over-bright areas; also controls the intensity of an applied dirt mask (if any) when looking at over-bright objects.
- Ambient Cubemap: uses a panoramic texture to light the scene, much like the PBR viewport in Substance. Analogous to "environmental lighting". Does not cast shadows, should be used in conjunction with proper lighting techniques.
- Auto Exposure: simulates "eye adaptation" as when moving between light and dark environments. Expand the advanced rollout for a host of tweakable variables.
- Lens Flare: simulates lens imperfections. Image-based and customizable. Radius increases draw cost.
- Ambient Occlusion: screen space ambient occlusion (SSAO) "darkens the cracks". Use judiciously as most PBR should already provide decent AO. Tweak on a cosmetic pass.
- Screen Space Reflection: when enabled, turns on very simple reflections from Materials. One of the features I most wanted coming from Unity.
- Global Illumination: provides control over the color and intensity of indirect light coming from Lightmass. Useful for adding appropriate atmosphere to a scene (time of day, emotional overtones).
- Depth of Field: applies blur based on distance in front of or behind a defined focal point.
- Gaussian: fast, standard, good for real-time gaming application.
- Bokeh: shaped blur, more expensive, better for cinematic presentation.
- Motion Blur: self explanatory. Good to note that this is not a global default, and needs to be enabled/configured on a case-by-case basis.
- Screen Percentage: supposedly used to down/upsample the resolution, possibly broken in 4.7.2.
- AA Method: anti-aliasing method: none, FXAA, or Temporal AA.
- Blendables: postprocessing can be modified through materials. This feature appears to be experimental/under construction at this time.
- Priority: nested volumes with higher priority will be used.
- Blend Radius and Weight: used to create "feathering" in a PPV. The volume's influence increases as the camera approaches its center; Weight balances between two or more overlapping PPVs.
Particle Systems (Deprecated by Niagara?)
- RC under Particle Emitter → TypeData → New GPU Sprites: uses GPU sprites.
- RCuPE → TypeData → New Mesh Data: enables instancing of a mesh. Note: setting Lifetime to zero allows the mesh instance to live forever. Good for creating a visual anchor for the particle system.
- RCuPE Acceleration → Const Acceleration: simulates gravity/adds velocity.
- RCuPE Location → Emitter Initial Location: assign point of origin for particle emitter.
- RCuPE Location → Bone/Socket Location: assign skeletal mesh bone(s) as PoO for PE.
- RCuPE Attraction → Point Gravity: draws all particles to a single point based on strength and falloff radius.
- RCuPE Collision → Collision (Scene Depth): adds collision (does not require struck meshes to have colliders!)
- RCuPE → TypeData → NewRibbonData: similar to TrailRenderer in Unity.
- Particles using lit Materials can be affected by lights, at a significant performance cost.
- CPU particles can emit light, at a significant performance cost. Best practices dictate a separate emitter from the target one with reduced number of particles and low emission percentage.
- RCuPE → TypeData → NewBeamData: similar to LineRenderer in Unity, though significantly more flexible.
- RCuPE → Beam → [options]: modules for modifying Beam Data, of particular interest is the Noise module which can produce oscillating and crackling, lightning bolt-like effects.
- 3DS Max script.
- Practical usage and integration into pipeline required for complete understanding. Seems very useful! Link.
- Cloth assets (as well as destructibles with more than 1 layer of fracture depth) can only be created in APEX PhysX tools.
- Practice with these tools will be essential for going forward with these assets.
Dynamic Scene Shadows
- Cascading Shadow Maps use the Dynamic Shadow Distance from the camera and create a transition from dynamic shadows to static baked shadows. Found in the Details panel of a Directional Light Actor.
- Reflection Capture Actor: an abstract Actor that bakes reflections into other Actor Materials within their radius pre-runtime. 1:1 on Roughness of PBR Materials (a dream come true).
- Box Reflection Capture vs. Sphere Reflection Capture: the shape of the capture; Box is more performant, but Sphere is more flexible.
- Screenspace Reflections: see Post Processing.
- Scene Capture Cube: analog to Unity's dynamic cubemapping. Expensive, should be avoided unless trying to achieve a specific effect (possibly architectural rendering if other reflection methods are not producing good enough results).
- Scene Capture 2D: analog to Unity's RenderToTexture. Processes every frame, like Scene Capture Cube, should be used sparingly.
- Useful audio console commands (opened with ` while game is running):
- stat soundwaves: list the playing Soundwaves and their volumes.
- stat soundcues: list the playing Soundcues and their volumes.
- stat sounds -debug: sound asset must have 'debug' set to true for attenuation settings to display.
- stat sounds off: remove debug stats.
- stat reverb: list active reverb effects.
- Spatialized Audio properties (plays from specific location):
- Allow Spatialization: true.
- Override Attenuation: true.
- Attenuate: false.
- Spatialize: true.
- Attenuated Audio (volume dependent on distance from emitter):
- Override Attenuation: true.
- Radius: full volume within this distance.
- Falloff Distance: range where volume will start decreasing to zero.
- Low-pass Filter Attenuated Audio (sound muffles with distance):
- Override Attenuation: true.
- Attenuate with LPF: true.
- LPFRadius Min: LPF function begins beyond this point.
- LPFRadius Max: Full filter at this limit.
- Note: LPF has no function within 300 units.
- Shaped Emitters:
- Attenuation → Attenuation Overrides → Attenuation Shape.
- Sound Attenuation Settings as Asset:
- Add New to Content Browser → Sounds → Sound Attenuation.
- Set properties on new Sound Attenuation asset.
- Assign asset on Sound Actor's Attenuation → Attenuation Settings.
- Sound Cues are the bread and butter audio building blocks.
- Reverb is handled by an adjustable Audio Volume, like a trigger. Anything inside the volume will be affected by the selected Reverb Effect. Reverb Effects are user-definable assets.
- Audio Volumes can be used to isolate and deaden external ambient sound, serving as a convenient "switch" between interior and exterior noise.
- Sockets are customizable pivot points that can be added to existing meshes and used as Blueprint variables. In the SME open Window → Socket Manager to access them. Once a socket has been defined it can be assigned by selecting the mesh component in the BPE and Details → Sockets → Parent Socket.
- RC on SM_Asset → Asset Actions → Export…: exports the selected static mesh.
- While BSP brushes are certainly convenient for white/greyboxing levels, the tools in 3DS Max provide far more flexibility and establish an asset pipeline early.
- Establishing AI navigation meshes is a matter of surrounding any geometry where movement is desired with a Navmesh Volume.
- It's not clear from the example content how to make this navigable by a player-possessed pawn, say for point-and-click movement, so further research is required.
- For basic AI navigation it's useful to note that the navmesh can dynamically update, and also contains an inbuilt function (Nav Link Proxy) for special traversal cases.
- An interactive example of vector math and functions. Useful reference.
- Level streaming is handled by triggered Blueprint node Load Stream Level. In-game, Level Streaming Volumes are used to mask the loading from view.
- It is possible to play movie assets outside of the loading screen (projected to texture, or as part of a UMG element), targeted to be fully production ready in version 4.8.
- Documentation link.
- Physics-driven animation for characters and Skeletal Meshes.
- A Physics Asset must be applied to the mesh (created in PhAT).
- The effects are driven via Blueprint nodes:
- Useful for ragdoll effects, simulating broken bones, and localized physics-driven hits to characters (heads snapping back, legs getting nudged, etc.)
- Set All Bodies Below Simulate Physics: recursively enable physics (ragdoll) on all bones in the chain below the one set in the node.
- Set All Bodies Below Physics Blend Weight: from 0.0 to 1.0, higher values mean more physics drive.
Subsurface Profile Shading Model
- Override for Subsurface Scattering on a Material, and slot a profile in. Provides a more accurate simulation of light distribution in translucent materials.
- Subsurface Profile: asset containing simulation information, consisting of subsurface and falloff colors, and scatter radius. Can be edited interactively.
Multiplayer Shootout Demo (Gerleve & Noon)
- Clock Icon on node means latent (asynchronous) node.
- Look into meaning of BP node icons.
- Spawn all elements important to gameplay on the server. Period.
- Distance Field AO: Project Settings → Engine → Rendering → Generate Mesh Distance Fields (Enable & Restart Editor) | Light: Raytrace.
- Sun Rays: Light Source → Light Shaft Occlusion & Light Shaft Bloom (Enabled)
Livestream Mar 12
UMG Unreal Motion Graphics (UI)
- The core of UMG consists of Widgets, or prefab UI constructors, edited within specialized Widget Blueprints:
- Designer: visual layout of interface.
- Graph: widget functionality.
- Custom cursors: forum post here.
Engine Feature Samples
Features Tour 2014
- Museum flythrough of many of the topics covered in Content Examples.
- Contains Blueprint for procedural steel tube railing.
- Rendering security camera Material.
- Rendering post process effect through translucent Material.
- 3 cameras, multiple blended animations on several Characters, visual and audio effects, all with completely exposed Matinee timelines. A must-study for cinematic creation.
- A vast terrain built with World Machine.
- Contains a Blueprinted mini-game: hanglide through the scene and fly through rings to get a speed boost.
- Excellent ice Material on frozen lake.
- 3 examples of Material-driven animated water with blended foam effect.
- Includes accompanying explanatory video.
- Excellent example of using lights, material and post processing.
- Includes excellent flashlight Blueprint.
- Includes interesting use of stacked mesh with Bump Offset Material Function to achieve realistic plush effect.
- A potpourri of Unreal features: Matinee, blended Animations, GPU particles, emissive Materials and Functions, Audio, and Landscape.
- Of particular note: emissive Material masking animation on pillar sigil.
- Includes two excellent complex particle effects, P_SparkBurst and P_SprayLeak.
- Includes example of using normal map warping to achieve water-damaged paper (M_Signs).
- Documentation illustrates hierarchical placement of Reflection Capture Actors, a best practice.
- Instanced Static Mesh (Component): Blueprint component for saving rendering time.
Packaging and Cooking Games
- Edit → Project Settings… → Maps & Modes → Default Maps → Game Default Map: set the level you want to load when the game loads; if this is not set you'll get a black screen to start.
- File → Package Project → [PlatformName]: get the ball rolling.
Launching to Devices
Livestream March 17: Building a Tank
- X axis is forward in Unreal.
- A Pawn is anything that can be possessed by a player or AI.
- Characters are useful for bipeds.
- WheeledVehicle (found via search) is for what it says.
- Tank was built with Pawn.
- Static meshes built in Max, sockets added in Unreal.
- All component SMs dragged into Pawn.
- Hierarchy established: body as root, turret parented to body, etc.
- Components easily aligned by selecting Sockets → Parent Socket.
- Some components may require zeroing.
- Sockets connect to pivot points.
- SpringArm is a component to attach a following camera to. Parent the camera to the spring arm & reset its transform.
- Settings → World Settings → GameMode Override: Tank_Game, Default Pawn Class = primary Pawn (in this case the tank).
- Event Graph on Pawn to Blueprint it.
- InputAxis Forward node → Add Actor Local Offset → Split struct, wire Axis Value to Delta Location X. Insert a Float multiply with variable to control speed.
- InputAxis Right node → Add Actor Local Rotation → Split struct, wire Axis Value to Delta Rotation Yaw with Float multiply.
- Target child objects to create additional rotations.
- In Bindings, Action Mappings are digital inputs; Axis Mappings are analog.
- Always use Material instances wherever possible for performance's sake (the shader is only compiled and calculated once).
Video Series - Endless Runner
- F8: jump out of the character control while the game is running.
- Rotating Movement: component that can be added to Actors for easy, simple automated rotation.
- Random Point in Bounding Box: returns a random point in a specified bounding box. Useful for spawning things inside a defined Box Collision.
- Move Components in Child BP: now possible in 4.7x.
- Split Hit Struct Pin: opens a universe of options.
Livestream March 24: Foliage
- Procedural foliage generation based on seeding/biomes.
- Auto-spawn grass on landscape.
- New Procedural Foliage asset: ProceduralFoliageSpawner.
- New Type: FoliageType_InstancedStaticMesh.
- Types: define. Random Seed, Tile Size.
- Drag & drop into scene, adjust Brush Settings sizes.
- Procedural Foliage → Simulate.
- Procedural: Clustering is the meat of the appearance results.
- Steps: Initial Seed Density is 10x10 meters.
- Num Steps 0 = 1 tree by 1 seed.
- Seeds per step = Exponent.
- Distribution seed controls clustering shape.
- Increasing Average Spread Distance can create denser clusters.
- Spawns based on Collision Radius (trunk of tree) and Shade Radius (canopy radius of tree).
- Growth: Trees shouldn't Grow in Shade. Other actors can have this enabled to populate around tree bases.
- Older trees tend toward Max Scale. Scaling within Growth controls overall size distribution of trees.
- Overlap Priority: if there are 2 different seeds competing for the same position, whichever is higher wins.
Video Series: 3rd Person Game With Blueprints
- When importing rigged assets for characters, skeletons can share animation data provided that the core bone hierarchy matches. Extra attachments are a non-issue (like accessories, hair, tails), but any additional bones that disrupt the core hierarchy will require their own animation sequences.
- Animation sequences can be targeted to skeletons on import, so the correct order of operations would be to import skeletons first, animations second.
- Any object filtered by Animation opens in Persona, Unreal's animation editor.
- Skeleton Mode: useful for adding sockets to bones. Asset Browser may not be visible by default, can be opened from Window → Asset Browser and docked.
- Mesh: useful for modifying the model itself (Materials, Clothing, Additional Meshes). Also provides access to Morph Target Preview.
- Animation: detailed access to animation timelines.
- Notifies: enable effects generation at frames, for example footstep sounds keyed for the exact moment when a character's feet hit the floor, or explosive particles for when a character's weapon strikes something.
- Curves: for driving variables that can change the behavior of Materials or other objects.
- Blend Space: method of blending between multiple looping animations based on a property.
- Can be 1 or 2 dimensional (think idle → walk → run, and the same with directional modifiers).
- Created like any new Asset in the Content Browser: RC → Animation → Blend Space (1D).
- Requires a Skeleton.
- 1D Blend Spaces can be displayed vertically, allowing for easier readability.
- Blend Space Parameters (1D): Label, Range, Divisions.
- Once the axis has been defined, simply drag and drop animation clips onto it at points in the range where they should be fully active. All points between them will blend incrementally.
- Animation Blueprint: BP for driving animations.
- RC in CB → Animation → Animation Blueprint: create new BP, AnimInstance is the most common Parent Class. Requires skeleton.
- Adds the Graph tab to the Persona editor for the asset.
- RC in AnimGraph → New State Machine: create new State Machine.
- DC new State Machine: expand the state machine.
- State Machine: container for animations during a given state.
- Drag Wire → Add State…: create a new state.
- DC new State: expand new state.
- Inside the state, drag and drop animation clips or blend spaces, then wire them to the output. In the case of blend spaces, here define the variables that drive the position in the space.
- After wiring additional states, a Transition Rule icon will appear over the wire. DC this to jump into its editor and define when the transition occurs.
- Time Remaining (Ratio) node: returns how much of an animation is left as a fraction of its length, counting down from 1 at the beginning to 0 at the end.
- MO Transition Rule: display a tooltip showing the rule.
- Vector Length of Velocity: magnitude (speed).
- RC in CB → Blueprint Class → Character: create a new pawn with built-in abilities to walk, run, jump, fly, and swim (humanoid locomotion)
- Includes Capsule Component, Arrow, empty Mesh, and CharacterMovement (all variables for locomotive control).
- Define a mesh, and assign the Animation Blueprint (above) to Animation → Anim Blueprint Generated Class.
- Axis Events vs Values: an Event is when you want to make something happen, a Value is when you want to query what the current value of the axis is. Set vs Get, in other words.
- Add Movement Input: moves the movement component in defined direction, modulated by a scale value.
- Control Rotation: current rotation of target pawn. Yaw controls character facing.
- Game Mode BP: the master setup configuration for the game: the type of pawn the player uses, what type of camera, the type of HUD (if any), etc.
- Settings → World Settings → Game Mode → GameMode Override: useful for setting the current Game Mode BP that executes when Play/Simulate in editor.
- RC in CB → Animation → Animation Montage: method for combining animations and selectively playing them, triggered by events. Montage can handle root motion (collision snaps to skeleton). Requires skeleton.
- Opening Persona via Character's Anim Blueprint tends to give the widest array of tools from the editor.
- Persona → Animation Tab → Skeleton Tree → Show Retargeting Options: If an animation clip requires retargeting (different source/target scales, for example).
- Quick retarget: RC Root → Recursively Set Translation Targeting Skeleton.
- Pelvis drop-down → Animation Scaled.
- Root drop-down → Animation.
- Drag-and-drop animation clips onto the Montage timeline to insert them in sequence.
- RC top of Montage sequencer: add new Section.
- RC Montage Section name: options.
- MW on Montage sequencer: zoom in/out.
- RC and drag on Montage sequencer: scroll left-right.
- Sections → Clear: create clean Sections palette.
- Click a Section in the Section sequencer, then select another Section from the palette above to add it to the end. Repeat selection will create a loop.
- Montage Play node: wire from custom event to call a montage.
- Emitter on Notify: create a positional emitter for particles or audio during animation clips.
- RC on target bone → Add Socket: create position for emitter.
- RC on animation timeline at trigger position → Add Notify… → PlayParticleEffect.
- Select Notify → Details → Anim Notify: set parameters here.
- Attached: does the particle stick to the socket or is it left behind after triggering?
- Quick Particle System Refresher:
- All Deselected → Particle System → Warmup Time: pre-simulates the system for this many seconds before it first renders, so it starts mid-flow; 0 means no warmup.
- Spawn Selected → Spawn → Rate → Distribution → Constant: 0, Process Spawn Rate: Disabled, Burst → Process Burst List: Enabled, Burst List → Add Element: enables burst spawn vs. constant spawn, defined by Count.
- Required → Duration → 1: spawn a single burst.
- Required → Emitter → Kill on Completed: Enabled: clean up the system after spawn.
- Lifetime → Lifetime → Distribution → Min/Max: how long the particles live.
- Color Over Life → Distribution → Constant: can be used to manually tweak the color.
- Color Over Life → Curve: quick visual edit of alpha value over time, in Curve Editor.
- Initial Velocity → Start Velocity → Distribution → Max/Min: shapes the spray of the emitter.
- Initial Size → Start Size → Distribution → Max/Min: size of the spawned particles.
- Drag (may need to RC under the Emitter to add): damping (slows particles over time).
Video Series: Introduction to Materials
- Clean Up: remove unconnected nodes.
Setting Up Unreal Engine Repository With Sourcetree, git, & GitHub on Windows 8.1
- Set Up Git.
- Set Up SourceTree.
- Open SourceTree & Clone/New → Create New Repository → Repository Type: Git, Destination Path: your Unreal project folder, Name: as you like, Folder: [Root] → Create.
- Select files to commit from Unstaged files. It may be easier to visualize/select from Tree view mode.
- If you're just sharing the bare bones, you'll need to commit the Content folder & the .uproject file. You won't need anything from Config, Intermediate, or Saved.
- Click the main Commit button (next to Clone/New in the top left), make a detailed comment about the contents, then click the Commit button in the lower right of the comment window.
- Create a Repository on your GitHub.
- GitHub will supply a list of commands you need to execute on your local machine. Copy the 1st, which should start: git remote add origin URL
- In SourceTree, click the Terminal icon (top right).
- Right-click on the Terminal window & Edit → Paste, then press enter. You'll be prompted to enter your GitHub security credentials; do so.
- Repeat for the 2nd command, git push -u origin --all
- I'm not sure if this command is necessary, GitHub doesn't supply it but the guy in the video did it, & I did it without error, so go ahead & enter this too: git push -u origin --tags
- At this point the repository should be mirrored on GitHub, click through your account to confirm.
- Changes made to existing content will show up with a yellow "..." icon & number under the repository name in SourceTree. These files can then be selected in the Unstaged files section, moving them into the Staged files section.
- Click the main Commit button, enter a detailed comment, enable Push changes immediately to origin/master, click the Commit button in the lower right of the content window, enter security credentials, & you're off to the races.
Creating Basic UMG UI
- Set up the UMG visual stuff.
- Bind the element to change via gameplay.
- In the Binding, Get Player Pawn/Character, cast to the player Blueprint, wire off the As… pin with a Get to the variable you want to link to the UI element. Wire off the variable to the Return Value on the ReturnNode.
- In a GameState BP, pull a Create Widget node from Event Begin Play or wherever the UI needs to be displayed.
- Set a variable for it for ease of use later.
- Wire an Add to Viewport node from the Set.
- You're in business.
Livestream April 22: Audio and Blueprints
- Audio Component allows for much wider array of control than PlaySoundAtLocation, which is essentially a "fire and forget" audio event.
- Chain Fade In and Fade Out nodes to create crossfades. As an example, a looping sound can be triggered with a Play node on input, then the crossfade chain triggered on input release.
- Set Pitch Multiplier: increase or decrease the pitch of an Audio Component.
- Set Volume Multiplier: increase or decrease the volume of an Audio Component.
- Shift+LC on Timeline: add keyframe at location.
- RC on keyframe: modify keyframe type.
- Adjust Volume node: allows for audio volume changes controlled by both Duration and Level.
Easy Dynamic Audio
- Split tracks and export .WAV from DAW.
- Import into Unreal.
- Construct Cue with Looping Wave Player nodes for each track.
- Wire each LWP node to Continuous Modulator, assigning Volume Modulation Parameter names for each.
- Wire all into Mixer node, then to Output.