The Fluid Flux is a powerful water system based on 2D shallow-water fluid simulations.


  • Realtime shallow water simulation – fluid data modifiers, wave generator, and extendable interface
  • Fluid surface rendering – caustics, wetness, underwater, waterline, advected foam, advected waves, blending with the ocean, dynamic audio detection
  • Fluid interaction – a simple, cheap ripple solver that moves with the character, optimized to an absolute minimum
  • Ocean wave blending – rendering a tileable ocean heightmap texture in a single pass
  • Niagara environment interaction – high-quality effects: buoyancy, plants, character swimming, boats
  • Clean, efficient, GPU-friendly implementation, interface designed with the KISS (Keep It Simple, Stupid) rule in mind
  • Small, compact, and low memory footprint
  • Tool for generating ultra-fast static meshes with flow maps baked into vertex color.
  • Advanced fluid state management with state loading during gameplay.
  • Niagara fluid async readback system for sampling height and flow of fluid in blueprints.
  • Dynamic audio analyzer. The sound source is positioned based on fluid movement.
  • Four example maps – beach, island, river, and baked static river
  • Velocity-based fluid flow advection method for foam, caustics, and waves


With great power comes great responsibility. I’m trying to be a reliable marketplace creator, so it is important for me to be clear about the limitations and disadvantages of using simulations. The Fluid Flux system gets a lot of hype, so please read the description below before you buy this product and make sure that everything meets your expectations and requirements:

  • In general, I can’t solve every possible water problem in every possible type of project. The Fluid Flux is a huge step forward for the game industry, but it also makes everything harder. Simulations can be hard to control and tweak; sometimes a single parameter can change the flow of fluid on an entire map.
  • The Fluid Flux simulation is based on a Shallow Water Equations (SWE) solver; the algorithm was published by Matthias Müller in “Real-time Simulation of Large Bodies of Water with Small Scale Details”. The simulation is calculated on a heightfield mesh, which means all obstacles are rendered to a heightmap with a top-down projection. Fluid can’t be simulated in caves or in multilayered scenarios.
  • Scalability is a real problem. The simulation requires allocating floating-point render targets, so the maximum recommended texture resolution is 1024×1024. That means if 1 pixel represents 100 cm in the real world (very low quality), your simulation area can cover approximately a 1 km × 1 km square with low-detail water. It’s not that much, but still useful. A simulation frame can be baked to a static mesh and placed on the level.
  • Multiplayer is not supported for dynamic simulation because synchronization of render targets is limited. However, the system can be used in a multiplayer game as long as your gameplay is not dependent on fluid simulation results.
  • The geometry of the fluid is rendered using a static mesh plane displaced by the fluid height. The system uses a huge plane (1024×1024) for rendering the water surface without any dynamic tessellation. This method has a lot of quality flaws.
  • The statically generated LOD mesh can be inconsistent with the waterline post-process; this issue will be addressed in the future.
  • Water Plugin is not supported in the current version. It is possible to use my water material with the water plugin mesh but it is not officially supported.
  • The Niagara fluid readback system returns results with a one-frame delay. It’s good enough for most features like swimming, water detection, etc.
  • My support time is very limited. It’s an early version of this product; be aware that there will be multiple updates that will improve the quality and efficiency. The project is enormously big and complicated, and simulations can be unpredictable – sorry for all the problems and bugs. I’m ready to help; in case of problems, contact me on Discord or by e-mail.
  • The documentation is still incomplete in some areas. I was pushed to release as soon as possible and did not have enough time to build simple examples. The Fluid Flux system is clean and elegant, but the ability to read blueprints and analyze example code is required to use it.
  • Only an axis-aligned rectangular simulation volume is supported, which means it can’t be rotated.
  • The simulation area can’t be moved at runtime. A movable volume is one of the most important features planned for future updates.
  • Trace hits against dynamic fluids are currently not supported.
  • The system also does not support the wave break effect, because this approximation of fluid does not contain the data needed to render it.
  • Underwater glass, holes, and the submarine view are not supported.
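The scalability limit above (1024 pixels at 100 cm per pixel covering roughly 1 km) can be checked with a little arithmetic. A minimal sketch, assuming two RGBA 32-bit float render targets; the pack's real target count and formats may differ:

```python
# Back-of-the-envelope check of the limits described above. The assumption
# of two RGBA 32-bit float render targets is illustrative; the pack's real
# target count and formats may differ.

def coverage_m(resolution, cm_per_pixel):
    """Side length of the simulated area in meters."""
    return resolution * cm_per_pixel / 100.0

def render_target_mib(resolution, channels=4, bytes_per_channel=4):
    """Memory footprint of one float render target in MiB."""
    return resolution * resolution * channels * bytes_per_channel / (1024 * 1024)

side = coverage_m(1024, 100.0)        # 1024 px at 100 cm per pixel -> 1024 m
memory = 2 * render_target_mib(1024)  # two RGBA32F targets -> 32 MiB
```

Doubling the resolution to 2048 quadruples both the pixel count and the memory, which is why the recommended maximum stays at 1024×1024.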

Good and bad practices

  • Try to avoid huge simulation resolutions because you will run out of memory very fast; 1024×1024 seems like a good compromise for now.
  • Don’t make your gameplay rely on fluid simulation in multiplayer, because it will not be synchronized.
  • Avoid using Fluid Flux on flat surfaces; subtle slopes are always better.
  • Avoid hard-edged geometry (boxes); it can be approximated incorrectly on the heightmap and sometimes looks terrible.
  • Don’t overestimate the possibilities. If I did not prepare an ocean with ships and a huge island covered by rivers and simulated lakes, there is probably a reason for that.
  • Try not to update the simulation ground every frame; it can cost you a lot of performance.

Questions & Answers

  1. Unreal Engine 5 support?
    The Fluid Flux was created and tested on Unreal Engine 4. Unreal Engine 5 is in an early, unstable stage of development. The pack can be used with this version of the engine, but I can’t support engine-related bugs or all the differences between UE4 and UE5. Please read the “Known issues” chapter to make sure that you are aware of the most common issues that you may encounter.
  2. Why is the price so high? Could you do some promotion?
    • Creating the Fluid Flux pack took me over 16 months of work at night, after my regular job. Normally this kind of work would cost a company at least $150,000, so it is a huge shortcut for everyone.
    • A subscription to all future updates and support is included in the price. This kind of system requires a lot of support, and engine updates make it even harder. It is not the only product on my marketplace profile, so my time is very limited right now. I quit my job to support this product and I will probably have to hire someone to help.
    • The product is a combination of multiple big features that are designed to work together, and thanks to that your life can be easier. You don’t have to buy multiple products and work on merging ripple simulation, fluid simulation, water surface, ocean waves, buoyancy, waterline, post-process, and swimming. I did it for you.
    • Take a look at my marketplace profile and see that I am a trustworthy creator with hundreds of positive reviews. The price is always based on the real work and time that I need to put into my products. Someday you will understand that time is a real currency that is worth something in this world.
    • I will start doing some promotional sales when everything gets stable and I find myself ready to support new customers. If you are not in a rush, just add my product to the wishlist and be patient. If you want to be notified a few days earlier about a planned promotional sale, join my Discord server 🙂
    • I’m not forcing anyone to buy my product. There are free alternatives like Niagara Fluids and the Water Plugin; you can try them and compare the results with my demo – maybe you don’t need the Fluid Flux.
  3. Can it be used in my open-world game?
    The open-world setup has not been tested yet. The demo examples show how the system can be used. I can’t promise anything more at this point of development. Just assume that you get what you see in my videos, no more, no less.
  4. Does it support multiplayer/replication?
    Unfortunately, the fluid simulation is not replicated. A huge render target is generated every frame and can’t be synchronized, so if differences appeared between clients, there would be no way to correct them. The package can be used in multiplayer games as a visual addition.
  5. External packs integrations ALS/UDS/Fluid Ninja 
    I am planning to work on some integrations in the future. If you have some specific pack in mind let me know.
  6. Can I use it with Voxel Plugin?
    Yes, you can use Fluid Flux with Voxel Plugin but it is also limited to heightfield projection so caves and planets are not supported.

    Use Simulation.RuntimeCaptureDelay = 1.0 to force the simulation to wait for the Voxel Plugin to generate the mesh before rendering it to the height map.
    The heightmap in this presentation is updated every frame using CaptureGroundHeightmap. It is not very efficient and should probably be optimized to only update the area around the brush when something changes on the map. The UpdateGroundMap(Position, Size) function would be a better choice in this case.
  7. What about VR and mobile?
    VR devices and mobile are currently not supported.
    Dynamic fluid simulations with high precision can’t be calculated on mobile, so only baked meshes/states can be used on this platform. I am planning to add VR and mobile support in future updates. The pack will receive a cheap surface material mode similar to the solution presented in Aquatic Surface. One user notified me that my demo maps work without issues on Oculus Quest 2, but I have had no chance to check it yet, so I can’t confirm it.
  8. How to add swimming to ALS? Can you show me?
    This task is on your side. The Fluid Flux data component (BP_FluxDataComponent) can give you all the data you need to implement swimming. I’ve also prepared an example swimming implementation (BP_FluxSwimmingComponent) for testing.
  9. Should I switch to Fluid Flux? Is it better than WaterPlugin/Oceanology or Aquatic Surface?
    It depends on your requirements. The Fluid Flux is good at creating dynamic river simulations on heightfields and interactive scenes, but it does not scale very well, so if you just need a background ocean, you should not bother using simulations. Play the demo of the Fluid Flux and decide if this is what you want.
  10. Why blueprints? Is it slow because of that?
    This system is fully implemented in blueprints but relies mainly on the GPU (shaders/render targets). The cost of a C++ implementation would be similar in typical conditions. In future versions, the cost of blueprints will be reduced by Niagara implementations, so it will be even better. The user interface and the lack of a tessellated fluid surface are bigger problems with blueprints, but I will also work on solving that.
  11. Is fluid flux calculated deterministically?
    If you run a simulation on the same machine, it is deterministic (a constant delta time is used), but there are probably differences between floating-point operations on different GPUs, so it can’t be fully deterministic. In multiplayer, the biggest problems are the synchronization of state when someone joins the game and the case where the player loses some frames because of spikes and the simulation can’t work them off – I’ve decided to limit the number of accumulated iterations per frame in this case.
  12. Do I need this pack? Other water systems like your Aquatic Surface can do the same.
    Well, if you can’t see the difference, then you probably don’t need the Fluid Flux. I’ve prepared a simple comparison of Fluid Flux and Aquatic Surface that visualizes the difference:

    That does not mean Aquatic Surface is bad. It’s a great product with a completely different feature list. You have to choose a product that fits your requirements.

  13. What about the future of this project?
    I’m planning a lot of updates, optimizations, and moving implementation to Niagara.

Reporting Bugs

Code that doesn’t exist is code you don’t need to debug. I am trying to do my best to solve all possible bugs or find good workarounds, but there is always something that can break – it’s the typical lifetime of applications nowadays.

Simulations can be unpredictable, and some edge cases are probably still not handled. If you find a specific bug, you can report it to me and I will take a look at it within a few days. Before sending the report:

  • make sure that you use the newest version of my pack.
  • prepare a detailed explanation of the repro steps needed to recreate the bug, and make sure that the explanation is as clear as possible.
  • include information about the engine version you use and your development platform.
  • create a minimal example project that shows me the bug (it will increase the chance of fixing it).
  • prepare a video or screenshots presenting the problem.

Now you can report this bug by sending an e-mail at:

Known issues

  1. Meshes with dithering enabled can’t be cached by the ground capture. There is a workaround for this problem presented in the M_Photoscan_Master material.
  2. Currently, known issues that exist only on UE5:
    • In some configurations of UE5, reflections can flicker on rivers. It’s somehow related to the reflection capture actor, but I was not able to find the reason why. Removing the box capture helped once, but some users still experience this bug. If you find the answer, please let me know.
    • The audio parameters are not working on UE5 (this will be fixed in 5.0.2), so Fluid Flux uses a workaround that switches between under-fluid and over-fluid waves instead of soft blending.
    • Disappearing palm trees. This bug exists only in UE5 and will be fixed in version 1.2. Open MI_PalmTree and modify BranchDirectionScale = (1.0, 1.0, 0.9, 1.0).

  3. Plants will not load properly on the map if you are not in “Real-Time” render mode in the editor. It’s a limitation of blueprints; I don’t have any event that could execute after load and adjust the Niagara effects.
  4. “Real-Time” render mode is also required to simulate in the editor. Otherwise, the view will not refresh after each frame.

Planned improvements

The current status of improvements can be followed on Trello.

  • Surface mesh – better water surface. Currently it uses a dense planar mesh, but in the final stage the mesh will be dynamically tessellated.
  • Scalability – large-scale simulations locally updated with moving areas are in my plans.
  • Niagara fluids – the system will be rewritten to Niagara fluids when it is stable and ready for this.
  • Audio detection – the current system uses an ambient audio source and a simple dynamic source; I am planning to improve it to three dynamic audio sources that work at different distances.
  • Simulation volume – a movable simulation area and working with multiple simulations at the same time.
  • Alternative material modes – path tracing and stylized water.


Getting started

This section covers the fundamentals of Fluid Flux and its tools. If you are new to Unreal Engine, you should become familiar with the Unreal Editor interface, Blueprint visual scripting, and the types of content. Working according to the documentation can provide the best possible experience with this product. 

Project configuration

The Fluid Flux pack requires enabling a few basic plugins and the TPP input config for testing the character. The package is an asset pack, so it can be downloaded directly to the project. The package can be added to any project using the Epic Launcher, but users can also use an empty project compatible with UE4.26 configured by me that can be downloaded from the link: FluidFlux_Template.

A list of plugins that need to be enabled can be found below:

  1. Editor Scripting Utilities
  2. Niagara
  3. Procedural Mesh Component
  4. Asset Tags (optional)

The first place to visit in the demo folder is Demo/Maps. This folder contains all the example maps, which work perfectly with the TPP example character (BP_DemoCharacter). If your project uses another character template, the input config probably needs to be changed to make it work. The input config can be downloaded from this link: DefaultInput

Make sure that DBuffer is enabled in your project settings, because it’s required for proper decal (wetness and caustics) rendering. If you don’t want to use those effects, you can disable the decal material in the surface actor.
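If DBuffer decals are disabled, they can be enabled in the project's DefaultEngine.ini. A minimal sketch using the standard UE4 renderer setting; this snippet is not shipped with the pack, and toggling the equivalent checkbox in Project Settings works just as well:

```ini
; DefaultEngine.ini – enables DBuffer decals (requires an editor restart)
[/Script/Engine.RendererSettings]
r.DBuffer=True
```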

Project structure

The Fluid Flux project is organized in a certain way; this short description will bring you closer to what kind of content can be found in each folder.

  • Demo –  The demo is the most important folder for new users. The demo examples present how to use this pack, how effects can be achieved, and how to integrate systems with character. Everything that can’t be found in this documentation probably will be presented in a very clean way in the demo folder. 
  • Editor – editor-related tools, icons, utilities, procedural mesh generator
  • Simulation – Shallow water simulation actor and tools for controlling fluids and generating state.
  • Interaction – Simple system of interactions that adds detailed lightweight ripple fluid simulations.
  • Surface – Renders the simulated fluid surface, underwater volume, post-process, caustics, and audio, and combines everything together.
  • Environment – Niagara particle systems that read back information from the simulation, implementing swimming and buoyancy and driving particles using the fluid state.
  • Waves – A system designed for generating ocean waves that can be used in the background and mixed with the fluid simulation.

Before starting work with the pack, it’s worth becoming familiar with the structure of the pack and the systems that it provides.

Updating process

The current newest version is described on the marketplace page and at the top of the documentation. Frequent updates of Fluid Flux are planned, so always make sure that your Fluid Flux package version is up to date, but be aware of potential problems when updating.

Changes that can be made during updates:

  • fixing bugs
  • adding features and examples
  • removing/renaming files
  • improving quality and exposing parameters

I always try to minimize the damage from updates; however, it is not an easy task when the marketplace update system supports only adding or replacing files. There are a few basic rules worth noting before updating:

  • It’s better to have a backup. Copy your version of the pack before doing the update.
  • Remove the pack from the project before updating it. This way you will get a clean, fresh version of the pack without any ghost files.
  • Do not modify the pack on your own. If you need modifications, inherit classes/materials and override functions. Store child classes and custom material instances outside the Fluid Flux folder.
  • If you need some small, trivial modification that could improve usability, let me know and we will figure out a solution, maybe putting it into the next update.

Simulation system

The BP_FluxSimulation blueprint is the heart of the Fluid Flux system. Basically, this blueprint handles important tasks like:

  • Rendering of the ground heightmap to texture.
  • Updating simulation of shallow water fluid, foam, and wetness.
  • Baking and exporting simulation state.
  • Sending data to the fluid surface renderer.

The shallow water simulation is based on the assumption of linear vertical pressure profiles, so it is simulated in two dimensions. In general, the algorithm can be described in a few steps:

  1. Simulation data is stored in 2D render targets.
    Ground map – information about the landscape and obstacles
    Velocity (RG) Depth (B) Foam (A) map – stores information about the fluid
    Height (R) Wetness (G) map – stores the surface height and wetness of the surface
  2. The slope of the ground heightfield and the slope of the fluid are combined and used for calculating the pressure and velocity.
  3. The simulation is an iterative process of updating fluid height and velocity.
  4. The result of the integration is used for foam and velocity advection.
  5. Fluid modifiers can be used as input for the simulation to change the current state.
  6. The simulation frame is used for accumulating fluid wetness and generating the fluid surface mesh displacement.
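The update loop above can be sketched in a few lines. This is a minimal 1D illustration of a column-based shallow-water step, not the pack's GPU code: the names, the staggered-grid layout, and the upwind flux are assumptions made for the example.

```python
# Minimal 1D sketch of a column-based shallow-water step, in the spirit of
# the algorithm outlined above. Illustrative only; the pack runs this on
# 2D render targets in shaders.

def swe_step(ground, depth, vel, dt, g=9.81, dx=1.0):
    """One explicit step: accelerate face velocities by the surface slope,
    then move fluid between columns using an upwind flux."""
    n = len(depth)                      # vel holds n + 1 face velocities
    surface = [ground[i] + depth[i] for i in range(n)]

    # 1) pressure term: faces accelerate down the surface slope
    new_vel = vel[:]
    for i in range(1, n):               # boundary faces 0 and n stay closed
        slope = (surface[i] - surface[i - 1]) / dx
        new_vel[i] = vel[i] - g * dt * slope

    # 2) continuity: depth changes by the divergence of depth * velocity
    def flux(i):
        if i <= 0 or i >= n:
            return 0.0
        h = depth[i - 1] if new_vel[i] > 0.0 else depth[i]   # upwind depth
        return new_vel[i] * h

    new_depth = [max(0.0, depth[i] - dt * (flux(i + 1) - flux(i)) / dx)
                 for i in range(n)]
    return new_depth, new_vel

# A single water column on flat ground spreads out over a few steps.
depth, vel = [0.0, 1.0, 0.0], [0.0, 0.0, 0.0, 0.0]
for _ in range(5):
    depth, vel = swe_step([0.0, 0.0, 0.0], depth, vel, dt=0.05)
```

The same two phases (velocity from slope, depth from flux divergence) run per pixel in the real simulation, with the foam and wetness passes layered on top.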


A BP_FluxSimulation actor placed on the map does nothing by itself, because it’s empty and needs to be filled with fluid. The simplest way to fill the container with fluid is by adding a modifier source actor.

If the “Pixel Depth Offset” feature is used in a material, it may not be rendered to the ground map and may cause simulation problems. In this case, the best option is to use the MF_FluxPixelDepthOffset material node, which automatically disables “Pixel Depth Offset” during the process of caching the ground map. The M_Photoscan_Master material is an example workaround for this problem using the MF_FluxPixelDepthOffset node.

Real-time simulation

Before you start simulating fluid on your level, make sure that Real-time rendering is active in your viewport options.



The modifier component changes the current simulation state. It’s the simplest way to control the simulation. Modifiers can be used for:

  • Adding fluid to the simulation container.
  • Removing fluid from the simulation container.
  • Changing the velocity of the fluid in some areas.
  • Simulating interaction with the fluid.
  • Generating waves.
  • Controlling fluid flow and volume.

The Fluid Flux contains predefined modifiers:

  • BP_FluxModifierComponent – Base parent class for all fluid modifiers.
  • BP_FluxModifierSourceComponent – Simple modifier that allows adding/removing fluid and changing velocity in a specific area. 
  • BP_FluxModifierGerstnerComponent – Generates Gerstner waves.
  • BP_FluxModifierInteractionComponent – Generates interaction with fluid. Can be added to the character.

Every modifier component extends the basic BP_FluxModifierComponent and implements specific behavior. Users can create new modifier classes with specific behaviors like whirlpools/waves. The BP_FluxModifierSourceComponent component is a good, simple example of a modifier that can be used as a starting point for learning the system.

The modifier container is a special type of actor that can store multiple modifiers and send them to the simulation. If an actor implements the BPI_FluxModifierContainer interface, it is considered a modifier container.

The BPI_FluxModifierContainer interface can be implemented by any actor. A good example is BP_DemoCharacter, which uses BP_FluxModifierInteractionComponent to interact with the fluid. Only the AddModifiers function needs to be implemented to make it work.

BP_FluxModifierContainerActor is a basic container that can combine multiple modifier components in a single actor that works simultaneously.

BP_FluxModifierSourceActor is a specific type of Container actor that simplifies the process of adding simple source modifiers to the scene.

Every modifier is rendered in an additional render target pass, so it’s not cheap. It’s good practice to minimize the number of fluid modifiers inside the simulation to achieve the best possible performance.

Known issues

Sometimes the simulation can behave unexpectedly because of stability loss on high slopes. It’s a well-known issue; it can be fixed by adjusting a few parameters to make the simulation more reliable.

Simulation spikes are caused by instability of the simulation in some conditions. The following parameters can help:
  • Slope Scale (Increase)
    Increasing this parameter will scale down the height of the scene on the Z-axis and make your fluid move slower on slopes. (This parameter will probably be redesigned in a future version for better consistency with world-scale measurements.)
  • Simulation Delta Time (Decrease)
    Fluid Flux updates with a constant delta time. The time elapsed between two game frames is divided by the delta time, and the result is the number of iterations. In general, more iterations mean lower performance. By default, the Simulation Delta Time attribute is set to 0.2; this is not safe in terms of accuracy, but it gives the highest performance.
  • Debug Preview / Debug Hidden In-Game (Debug)
    Pay attention to the ground debug preview. If you spot red spikes in places where the simulation does not work properly, the problem is probably your ground texture, and the geometry should be adjusted.
  • Try tweaking other variables: Slope Clamp, Velocity Clamp, Friction, Damping, Gravity, and even World Pixel Scale can make a difference.
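The constant-delta-time scheme described under Simulation Delta Time can be sketched as a time accumulator with an iteration cap; the cap is also what limits catch-up after frame spikes, as mentioned in the Q&A. The function name, cap value, and reset policy below are illustrative assumptions, not the pack's implementation:

```python
# Sketch of a fixed-timestep accumulator: frame time is divided into whole
# simulation iterations, and a cap drops the backlog after a frame spike
# instead of trying to catch up. Names and values are illustrative.

def step_count(accumulator, frame_dt, sim_dt=0.2, max_steps=8):
    """Return (iterations to run this frame, remaining accumulated time)."""
    accumulator += frame_dt
    steps = int(accumulator // sim_dt)
    if steps > max_steps:
        # frame spike: clamp the work and discard the backlog
        steps, accumulator = max_steps, 0.0
    else:
        accumulator -= steps * sim_dt
    return steps, accumulator

steps, acc = step_count(0.0, 0.5)   # 2 iterations, ~0.1 carried to next frame
spike, _ = step_count(0.0, 10.0)    # clamped to 8 iterations
```

Decreasing sim_dt raises the iteration count (more accuracy, less performance), which is exactly the trade-off the parameter exposes.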

I am still experimenting with different solutions to this spike problem and trying to figure out a method to force stability even in bad conditions. You can expect multiple improvements in future updates.

Ground heightmap

The simulation actor needs to capture the ground to recognize the environment and find slopes for fluid movement calculation. The simulation actor contains settings that allow configuring which objects should be rendered to the ground heightfield. 

The Debug preview option is the most important feature here, it is showing a heightmap that will be used for simulating water on the scene.

Note that not every detail will be captured in the ground map; the accuracy depends on the simulation resolution scale. The shader used for rendering this preview colors lines based on slope and exposes discontinuities in the ground.

  • Green lines – easy to simulate very stable fluid
  • Blue lines – Good slope for moving fluid in some direction
  • Red lines – can cause some instability if fluid flows over them, but good for walls
  • White lines – there may be a cave that can expose holes in the fluid mesh.
  • Fully red polygons represent holes and discontinuities in the mesh. They should be eliminated if fluid can flow into the area.

If you notice that some objects should not be rendered to the heightmap, you can exclude them; and if something is not rendering, you can investigate what is going on without simulating the fluid.

Meshes with dithering enabled can’t be rendered during the ground capturing process. There is a workaround for this problem presented in the M_Photoscan_Master material. The MF_FluxPixelDepthOffset node is used for disabling dithering on objects that are rendered in the top-down flat projection.

Sometimes picking “Hidden Actors” one by one to exclude them from the heightfield can be a waste of time. There is an additional tag called “FluxHide” that can be used to hide actors automatically.

Dynamic ground

The Fluid Flux uses a static ground map for simulating fluids, but updating it from time to time is also allowed. It’s not a cheap operation, so it should not be done every frame.

The BP_BreakableDam presented in the video can be destroyed when the player hits the trigger. It is a good example of updating the ground after a dynamic object changes. Take a look at the Break function, which is the heart of the dam system.

It calls UpdateGroundMap on the simulation actor, passing the position and the size of the object that will disappear, to recreate the ground in this area. You can do the same operation when adding something to your level.

Ground visibility

The ground scene capture component renders the current scene to the heightmap texture. Sometimes we need to exclude actors that should not be fluid blockers. In the Simulation:Ground:Scene tab you can find multiple options that help determine which meshes should be visible in the heightmap:

  1. Use “Hidden Actors” to exclude specific meshes.
  2. Use “Hidden Classes” to exclude by actor class.
  3. Use the “FluxHide” tag in actors that should not be rendered.
  4. Use the M_PhantomMesh material to render a mesh ONLY during ground processing (invisible in the world). This is good for imitating soft slopes of waterfalls or for creating an alternative version of the ground when the environment is more complicated (like indoor meshes).


The BP_FluxSimulation blueprint comes with a simple editor that allows simulating the state of the fluid on the map in editor mode. The simulation can be prepared in the editor and baked to a state, or dynamically updated by the simulation blueprint.

Fluid state

The PDA_FluxSimulationState is a special data asset created especially for storing the current frame of the simulation. The simulation state is the most important structure in the system because:

  • It’s dynamically updated by the simulation actor
  • Can be saved to a data asset file
  • Can be loaded in runtime
  • Can be used as starting point for simulation
  • It delivers data that can be rendered by the surface actor

    Example river state and baked textures.

Pay attention to the simulation resolution! The “Power of Two” rule is a fundamental necessity due to the way game engines work. You will not be able to export a simulation state if its size is not a power of two (128/256/512/1024, etc.).
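The power-of-two rule above can be checked with a standard bit trick before attempting an export. The helper below is just an illustration, not part of the pack:

```python
# Helper illustrating the "power of two" rule: a positive integer n is a
# power of two exactly when it has a single set bit, i.e. n & (n - 1) == 0.

def is_power_of_two(n):
    return n > 0 and (n & (n - 1)) == 0

exportable = [n for n in (128, 256, 512, 1000, 1024) if is_power_of_two(n)]
# 1000 is filtered out; 128, 256, 512, and 1024 pass
```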

A short tutorial about generating the fluid states:

The state use case is described below:

  1. A dynamic simulation state is automatically created in the BP_FluxSimulation.CurrentState actor constructor script. The data stored in it can be easily previewed.

  2. If BP_FluxSimulation.SurfaceActorReference is set, the actors communicate and the state is automatically passed to BP_FluxSurface.SimulationState.

  3. BP_FluxSimulation.CurrentState can be baked to an asset by right-clicking BP_FluxSimulation and choosing Scripted Actions -> Flux Export Simulation.

  4. The baked state can be used as BP_FluxSimulation.InitialState in the simulation actor.

  5. A baked state can also be used as BP_FluxSurface.SimulationState after removing/disconnecting the simulation actor.

Fluid surface

The BP_FluxSurface is responsible for an audiovisual representation of the simulation state data. The Surface actor supports a list of advanced subsystems:

Actor component – Features:

Surface
  • caustics decal
  • surface wetness decal
  • surface mesh
  • surface scattering
  • surface foam
  • surface detail advection
  • underwater post-process
  • underwater waterline
  • underwater surface
  • generating a procedural static mesh
  • analyzing the fluid surface for the audio source

Volume Absorption, Volume Scattering
  • rendering the underwater fog and absorption volume

Mesh generator

In the tutorial below, we will go through a useful workflow for generating static meshes based on a baked state of the simulation. With this method you can achieve the highest performance; unfortunately, the waterline effect will not correlate very well with the static mesh geometry.

  • Setting up LOD and padding
  • Converting a generated mesh to a static mesh asset
  • Preparing material for static meshes
  • Using static mesh in Surface actor

Mesh generation tools can be found in the BP_FluxSurface – Procedural Mesh tab.

Switch GenerateProceduralMeshView = true to see a preview of the generated mesh. After choosing this option, the surface actor automatically generates a static mesh based on the Procedural Mesh tab configuration.

The last step is selecting SurfaceProceduralMesh Component and clicking “Create StaticMesh” which will export the mesh to the static mesh asset.

If SurfaceProceduralMesh is not visible in the list of BP_FluxSurface components, deselect the surface and select it again. It works like that because the engine does not refresh the list of components after switching generation mode on/off.

Now you can find your static mesh in the Content Browser and preview it.

The river is simulated on the map and then converted to a static mesh. The velocity flow map still works, and the mesh can be used directly by the surface actor.

Saving static mesh assets with a dynamic material is impossible, which is why you have to clear the material slot in the newly created mesh before saving it. You can also set a “static” type of material (MI_River_SurfaceOverStatic.UseFluxState = false) that allows previewing the mesh in the editor.

Fluid data like foam and velocity are encoded in vertex color, which is why static meshes can use a simplified material that does not need to sample render targets. MI_River_SurfaceOverStatic is an example material configuration prepared for static meshes; MI_River_SurfaceOverStatic.UseFluxState = false forces the system to read data from vertex color.
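The vertex-color encoding described above can be sketched as follows. The channel layout (velocity in R/G, foam in B) and the velocity range are illustrative assumptions, not the exact Fluid Flux format:

```python
def encode_flow_to_vertex_color(velocity, foam, max_speed=4.0):
    """Pack a 2D flow vector and a foam amount into an RGBA vertex color.

    R/G store velocity remapped from [-max_speed, max_speed] to [0, 1],
    B stores foam. Channel layout and range are illustrative assumptions.
    """
    vx, vy = velocity
    clamp01 = lambda v: max(0.0, min(1.0, v))
    r = clamp01(vx / (2.0 * max_speed) + 0.5)
    g = clamp01(vy / (2.0 * max_speed) + 0.5)
    b = clamp01(foam)
    return (r, g, b, 1.0)

def decode_flow_from_vertex_color(color, max_speed=4.0):
    """Inverse mapping, as a simplified static-mesh material would do it."""
    r, g, b, _ = color
    return ((r - 0.5) * 2.0 * max_speed, (g - 0.5) * 2.0 * max_speed), b
```

Because the data travels with the mesh, no render-target sampling is needed at runtime, which is what makes the baked path so cheap.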

BP_FluxSurface can render static meshes using SurfaceMeshMode=StaticMesh which means the mesh will not be transformed by simulation state scale and geometry will be taken from SurfaceOverMesh component configuration.

Async readback

The Fluid Flux uses Niagara asynchronous readback events to read data from fluid render targets and pass it to blueprints. The BP_FluxDataComponent is a listener that can receive, update, and store fluid data at a certain location.

The Fluid Flux uses this feature in multiple situations:

  • buoyancy and floating objects
  • automatic dam breaking 
  • interaction detection
  • swimming system
  • fluid sound source analyzer
  • underwater camera detection

Add BP_FluxDataComponent to your actor and it will automatically detect the fluid surface under the actor. BP_FluxRotatorActor is a good simple example of an actor that reacts to fluid.

Interaction system

The BP_FluxInteractionCapture is a system designed for adding detailed ripple simulation to fluid surfaces as a result of interaction.

  • It’s a perfect addition to cheap prebaked static simulations, improving quality almost for free.
  • It currently supports only the simplest fast ripple solver, but it was designed to be extendable with other types of solvers (for example, a pressure-based fluid solver).

An example of the interaction system configuration on the actor side is presented in BP_DemoCharacter. The implementation is based on three elements:

  1. BP_FluxDataComponent – This component reads fluid data like height and velocity that are needed during further interaction calculations.
  2. BP_FluxInteractionComponent – This component stores a list of interaction sources. An interaction source is a sphere attached to a component (or skeletal mesh bone) that generates waves when interacting with the fluid.
    Configuration of an interaction source attached to the l_foot bone. The system uses the owner skeletal component with the tag “FluxInteractionOwner”.
    By default, interaction sources are attached to the skeletal mesh with the tag specified in the OwnerComponentTag attribute, so it’s important to add this tag (FluxInteractionOwner) to the skeletal mesh that will move the interaction sources.
  3. BPI_FluxInteraction – The interface that takes care of communication between the actor and the interaction capture system. When BP_FluxInteractionCapture overlaps an interactive actor, it calls the GetInteractions function to get information about the interactions that occurred.
    Simple implementation of the GetInteractions function from the BPI_FluxInteraction interface.
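As a rough illustration of the interface's role, here is a hypothetical sketch of what a GetInteractions-style query could compute: one ripple per source sphere that crosses the fluid surface, with strength scaled by how fast the source moves. Field names and the strength normalization are assumptions, not the pack's actual API:

```python
from dataclasses import dataclass

@dataclass
class InteractionSource:
    position: tuple   # world-space sphere center (x, y, z)
    radius: float
    velocity: tuple   # source velocity (x, y, z), e.g. from a bone

def get_interactions(sources, fluid_height):
    """Return a ripple descriptor for every source sphere that intersects
    the fluid surface plane at `fluid_height` (illustrative sketch)."""
    interactions = []
    for s in sources:
        z = s.position[2]
        if z - s.radius <= fluid_height <= z + s.radius:  # sphere crosses the surface
            speed = sum(c * c for c in s.velocity) ** 0.5
            interactions.append({
                "position": (s.position[0], s.position[1], fluid_height),
                "radius": s.radius,
                "strength": min(1.0, speed / 500.0),  # assumed normalization constant
            })
    return interactions
```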

Niagara integration

The Niagara integration is based on three elements:

  1. BP_FluxNiagaraActor communicates with the simulation and passes data to the Niagara system.
  2. NE_FluxData emitter that should be inherited by the Niagara system to read the data.
  3. NMS_FluxData special module that extracts simulation data to Stage Transients variables that can be used to drive the particles.

All particle systems (trash, plants, splashes) in the pack are constructed the same way; feel free to check the examples and modify them.

Ocean waves

The BP_FluxOceanWave is a dedicated actor for simulating ocean waves. The implementation is much simpler than classical analytical approaches like Gerstner or FFT. The system uses multiple tileable textures and combines them into a heightmap that can be used in the post-process/surface/Niagara systems to represent wave displacement.

Texture generated by the BP_FluxWaveTexture actor.
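The single-pass idea of combining scrolled tileable heightmaps into one wave height can be sketched like this. The stand-in texture function and the layer values are illustrative, not the pack's actual textures:

```python
import math

def sample_tileable(u, v):
    """Stand-in for a tileable heightmap texture: periodic in both axes."""
    return 0.5 * math.sin(2 * math.pi * u) * math.cos(2 * math.pi * v)

def ocean_height(x, y, t, layers):
    """Combine several scrolled copies of a tileable heightmap into one
    wave height. Each layer is (scale, scroll_x, scroll_y, amplitude);
    the values below are illustrative assumptions."""
    h = 0.0
    for scale, sx, sy, amp in layers:
        h += amp * sample_tileable(x * scale + sx * t, y * scale + sy * t)
    return h

layers = [(0.01, 0.02, 0.00, 60.0),   # large slow swell
          (0.05, -0.05, 0.03, 15.0),  # medium chop
          (0.20, 0.10, -0.08, 4.0)]   # fine detail
```

Because every layer is periodic, the combined heightmap stays tileable, which is what lets it be rendered once and reused across the whole ocean.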

The fluid surface uses data from a BP_FluxOceanWave placed on the map and pinned to the BP_FluxSurfaceActor variable. Blending between the simulation and ocean waves can be configured for special cases like the beach map.

The beach example contains ocean waves blended with fluid simulation. All materials configured for use with oceans are declared in the Surface/Templates/Ocean/ folder. The surface material uses the UseFluxOcean=true flag to activate ocean wave blending.

Ocean waves in the simulation are generated using the BP_FluxModifierWaveActor modifier, which adds fluid and velocity at the borders of the simulation area.

Additional background waves with a simplified material (SurfaceDistantMaterial) are attached to the surface in the “DistantMeshes” list.

Ocean scene setup

Preparing an ocean scene is an advanced topic because it requires multiple actors working together at the same time. It’s good to try all the other tutorials before recreating the ocean scene.

There is no video tutorial for ocean waves yet because I am planning many improvements on this topic (an infinite ocean surface is already implemented) that will be published soon in version 1.2 of Fluid Flux. The tutorial would go out of date very fast, so I decided to make it after the release.

Examples of the setup are presented in the demo maps FluxIslandMap and FluxBeachMap; you can copy this setup (Surface, Simulation, WaveTexture, WaveModifier) or create it from scratch on your map. Recreating the ocean scene and detailed wave setup is described below:

  1. Add a wave actor (set the height of the surface and the borders).
    Use RenderDebugPreview to see a preview of the analytical waves. This actor will not be visible; it’s only there to generate the ocean wave texture that the surface will use.
  2. Add an ocean BP_FluxSurface to your map. Set parameters:
    – borders that will be blended with the wave texture
    – the hardness of blending
    – the texture actor added in the previous step
    – Change the SurfaceMeshTransform.Scale parameter to (2.0, 2.0, 1.0) to see ocean waves around the simulation.
  3. Add a simulation area and set its reference actor to the ocean surface (the same way as in the surface tutorial).
  4. Add BP_FluxModifierWaveActor modifier. Change modifier component settings:
    Scale of modifier component to cover the simulation area
    SurfaceHeight to generate water at a certain height
    StateAreaBorders to select which borders should generate water
  5. Add distant meshes that will use a lightweight material for rendering the far-distance ocean:
    – add plane meshes SM_FluxPlane256x256 and scale them to 1000×1000 units
    – add the meshes to the DistantMeshes list in BP_FluxSurfaceOcean

The Projection FOV is a user-friendly, configurable system designed for projecting textures onto geometry. It is useful for:

  • implementing area of visibility in 3D games
  • projecting colorful textures on the environment
  • simulating shadow map in toon shading systems
  • a simplified, fast, and cheap alternative to a spotlight


  • camera field of view control combined with depth test
  • supports two types of material rendering (mesh-based and decal based)
  • frustum area attachment debug
  • advanced coloring and texture mapping
  • cinema-like projector
  • multiple types of shape projection (rectangle, circle, masked)
  • can be attached to sockets and bones
  • configurable projection receivers
  • distance-based coloring
  • camera direction and game type independent
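Several of these features hinge on projecting a world point into the projector's frustum with a depth test. A minimal CPU-side sketch of that projection follows; the axis convention, near/far values, and function names are my own illustrative assumptions, not the system's actual implementation:

```python
import math

def project_to_uv(point, projector_pos, fov_deg, near=10.0, far=5000.0):
    """Project a world point into the UV space of a projector looking down +X:
    perspective-divide by the depth along the view axis, then remap to [0, 1].
    Returns (u, v, inside) or None if the point is behind the projector."""
    dx = point[0] - projector_pos[0]   # depth along the view direction
    dy = point[1] - projector_pos[1]
    dz = point[2] - projector_pos[2]
    if dx <= 0:
        return None                    # behind the projector
    half = math.tan(math.radians(fov_deg) / 2.0)
    u = 0.5 + (dy / dx) / (2.0 * half)
    v = 0.5 - (dz / dx) / (2.0 * half)
    inside = 0.0 <= u <= 1.0 and 0.0 <= v <= 1.0 and near <= dx <= far
    return u, v, inside
```

Points with `inside == True` would receive the projected texture; everything else is masked out, which is how the frustum limit and depth test combine.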




Getting Started

Basically, it’s good to start with the example demo, which presents all features and a wide range of possibilities for implementing the system in a game.

The simplest way to use Projection FOV is:

  1. Drag and drop BP_ProjectionFOV onto the map.
  2. Adjust the actor to your scene or attach it to another actor (using the AttachTo function).
  3. Choose a material and set it in the ProjectionMaterial attribute. You can use any material from the templates in ProjectionFOV/Materials/Templates.

Projection FOV

There are two types of projection that can be used with the Projection FOV system – decal projection and mesh projection. The type of projection is detected automatically based on the master material used in the blueprint. Both have special features and limitations that need to be considered.

Decal projection

Implemented in master material M_ProjectionFOV_Decal

  • Can only be used with the BP_ProjectionFOV actor because it requires a decal component.
  • Can be filtered by “Receive decals”, so you can disable the rendering effect on some meshes.
  • Uses translucent blending
  • Good for rendering area of sight
Decal projection material used in Projection FOV blueprint for presenting the area of visibility.

Mesh projection

Implemented in master material M_ProjectionFOV_Mesh

  • Works pretty well with normal fading
  • Can be used on static meshes (SM_FrustumInside or SM_Light)
  • Uses multiplication blending
  • Good for simulating spotlight (car lights)
Mesh projection material used in example car blueprint as multicolored spotlights.

Material parameters

Basically, the Projection FOV comes with predefined material instances (ProjectionFOV/Materials/Templates) configured for specific use cases.

Users can also create their own materials by creating a copy or a material instance of the master materials M_ProjectionFOV_Decal/M_ProjectionFOV_Mesh.

Material parameters can also be modified dynamically during gameplay through the ProjectionInstance reference in ProjectionFOV.

Changing the base color of the projection during gameplay using a material instance reference.

Solder is a short RTS-style game concept set in a futuristic science-fiction world inside a computer, on an imagined PC board.
The tech demo is an introduction to basic RTS mechanics based on a parasite chip called SOLDER that evolves during its journey across the PCB.

Infinity Weather is a powerful and clean system designed for weather control in Unreal Engine.

The package is a combination of 7 systems that could be divided into separate packages – wind, displacement, landscape, precipitation, footsteps, fog, and post-process – now available as one configurable unified system.



  • configurable displacement capture blueprint
  • top-down projection of displacements rendered using shape definition in the shader (sphere, capsule, box, cylinder, decal, trace-sphere)
  • skeletal mesh displacement support
  • interface for easy integration with all types of actors
  • the area of displacement moves dynamically with the actor or camera
  • time-based accumulation of snow
  • built-in functions can be reused for multiple other effects (like grass interaction)
  • displacement material functions ( snow, mud, sand, grass)
  • a small world texture (1024×1024) can handle even a 150×150 m area


  • two materials with fast dynamic switching between material permutations 
  • post-process effects (raindrops, frozen, sharpen/blur)
  • pre-translucent effects (rain circles, heat haze, experimental glitter)
  • reacts dynamically on weather conditions
  • supports character and sequencer camera


  • notify based footsteps detection system
  • example footstep types (SFX, VFX) snow, rain, mud
  • physics material support
  • configurable footstep component adjustable per character
  • footstep volumes with priorities and conditional spawning


  • defined effects of snow, rain, dust, hail
  • GPU friendly and efficient, implemented on single material and mesh.
  • up to 65000 particles per emitter with high performance
  • occlusion maps
  • crossfade wind and rain sound effects


  • basic example materials (snow, mud, sand)
  • advanced landscape material prepared for mixing displacement and dynamic weather.
  • virtual texturing and layered landscape materials.
  • multiple example ground layers (rock, grass, mud, sand)


  • weather controller actor
  • spherical volumetric fog areas with atmosphere and temperature settings
  • weather surface material function (snow/wet layer)
  • spline-based camera-facing rainbow mesh
  • water puddle decal
  • directional light cloud shadows that react to wind direction
  • wind-reactive effects like precipitation, ground dust, flags, trees, bushes, grass, and emitters
  • example maps: Snow, Sand, Rain, World
  • character reaction to strong wind






Project config

Make sure that the Infinity Weather package version is up to date. The current newest version is described at the top of the documentation.

The package requires the procedural mesh component to work, so make sure that it’s enabled.


I’ve decided to make this project downloadable content that can be added to a project. Unreal Engine does not support shipping config files in downloadable content, so users have to add the footstep configuration to the project that uses the pack to make it work properly. The process is really simple:

  1. Open the project configuration: Edit->Project Settings
  2. Select the Engine->Physics tab and scroll down
  3. Add the physical surface types: Snow, Sand, Mud, Rock

The package works perfectly with the TPP input config files. If you have used another template, then you will probably have to add the inputs to your project for package testing:


The package is now ready!

Getting started

Every user of this package should start by checking all examples delivered with the product. Example maps can be found in the InfinityWeather/Demo/Maps folder.

Multiple examples are available that present different configurations:

  • SnowMap – simple snow effects with lightweight landscape material.
  • RainMap – simple rain and mud effects with lightweight landscape material.
  • DesertMap – simple desert and sunny weather with lightweight landscape material.
  • WorldMap – advanced multi-layered material with mixed landscape effects.

The example character interaction with the Infinity Weather world is implemented in Demo/Mannequin/BP_DemoCharacter by four basic components added to it:

  • BP_InfinityFootstepComponent – spawns footsteps effect from notifiers
  • BP_InfinityDisplacementComponent – renders shapes for ground displacement 
  • BP_InfinityPostProcessComponent – controls post-process and controller communication.
  • BP_InfinityPawnComponent – additional effects on a character like breath particle when it’s cold.

Weather Control

The most important element of the pack is the InfinityWeather/BP_InfinityWeatherController blueprint, which gives users the possibility to control weather conditions like fog, wind direction, precipitation, and accumulation.

Drag and drop BP_InfinityWeatherController on your level to start controlling the weather.

Default properties of the world weather can be set in the BP_InfinityWeatherController

Atmosphere effects require an Exponential Height Fog actor plugged into the BP_InfinityWeatherController to work properly. Remember to place a height fog actor on your map and set the corresponding attribute of the weather controller.

The list of weather controller attributes looks simple, but it’s a very powerful tool. For testing, try to set some rainy, windy weather using parameters like below:

The system can be controlled dynamically by blending between parameters during the game using the functions below.

Functions of weather controller implemented for controlling the weather dynamically.
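The dynamic blending described above can be sketched as a simple interpolation between two weather states. The parameter names and the linear blend are illustrative assumptions, not the controller's actual API:

```python
def blend_weather(current, target, elapsed, duration):
    """Linearly interpolate every weather parameter from `current` toward
    `target` over `duration` seconds (illustrative sketch)."""
    t = 1.0 if duration <= 0 else min(1.0, elapsed / duration)
    return {k: current[k] + (target[k] - current[k]) * t for k in current}

# Two hypothetical weather states (parameter names are assumptions):
sunny = {"precipitation": 0.0, "wind": 0.1, "fog": 0.05, "wetness": 0.0}
storm = {"precipitation": 1.0, "wind": 0.9, "fog": 0.40, "wetness": 1.0}
```

Calling such a blend every tick is what makes weather transitions gradual instead of popping between presets.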


The precipitation system is the most advanced part of this pack. It is based on a single material combined with a single mesh (no Niagara or Cascade). The billboarding effect is computed in the vertex shader in the local space of the camera. This means the system has almost no CPU cost and works ultra-fast.
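The billboarding computation can be illustrated on the CPU for clarity; the pack does this per vertex in the material, but the underlying math is the same idea of expanding each quad along the camera's right/up axes:

```python
def billboard_corners(center, size, cam_right, cam_up):
    """Expand a particle quad around `center` along the camera's right and
    up axes so the quad always faces the camera. CPU sketch of a
    vertex-shader technique, not the pack's actual shader code."""
    half = size * 0.5
    corners = []
    for sx, sy in ((-1, -1), (1, -1), (1, 1), (-1, 1)):
        corners.append(tuple(
            c + half * (sx * r + sy * u)
            for c, r, u in zip(center, cam_right, cam_up)))
    return corners
```

Doing this in the vertex shader means one static mesh of degenerate quads can render tens of thousands of raindrops with no per-particle CPU work.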

There are a few predefined precipitation effects defined in the project (/InfinityWeather/Precipitation/):

  • Rain/BP_InfinityPrecipitationRain
  • Snow/BP_InfinityPrecipitationSnow
  • Dust/BP_InfinityPrecipitationDust

The precipitation effect can be used without a weather controller. You can drag and drop the blueprint of the chosen effect onto the map to use it as a static volume.

Rain and snow precipitation placed on the map.

A manually placed actor can be configured and limited to an area of precipitation. Users can also set multiple weather types visible locally in some areas of the map. All parameters are described in the editor hints.


You can create your own precipitation effect classes for a specific rain/snow configuration by extending the base class and changing attributes. The newly created class will be available in the Precipitation Effect Class list in the weather controller.

The /InfinityWeather/Precipitation/BP_InfinityPrecipitationOcclusion is a specific type of actor that contains a render target texture used for calculating precipitation collision with roofs and other cached meshes. After placing this actor on the scene, you will notice that within the area of the occlusion map the rain and snow are not rendered under the meshes.

The white box is the occluder. Occlusion map preview rendered over the mesh. Notice that there is no rain under the occluder.

Currently only one occlusion map per level is supported, but switching dynamically between multiple occlusion maps is planned.


The wind is based on the integration of multiple effects that react to a simple wind direction vector.

Currently, the wind effects can be controlled globally on the full scene using the Wind Direction vector in BP_InfinityWeatherController.

Most objects that react to wind use the Environment/Functions/MF_InfinityWind material function, which returns all important information about the current wind status. It’s the simplest way to get global wind data.

List of implemented effects and objects that react to the wind force:

  • Precipitation direction – the Wind Force attribute in the precipitation blueprint scales how much the wind affects it.
  • Vegetation – the MF_WindBush, MF_WindTreeLeafs, MF_WindTreeTrunk, and MF_WindGrass nodes implement the effects of wind on vegetation.
  • Flags – a specific implementation in the flag material M_Flag.
  • Cascade particle emitters – the Wind Affected Emitters attribute in the Weather Controller is a list of emitters that should use the Wind Direction and Wind Intensity parameters to react (example: Environment/Dust/PS_SnowBlowingLarge).
  • Dust – predefined planar dust effects that rotate and fade based on wind direction.
  • Grass – the MF_WindGrass material node.
  • Landscape dust – the MF_GroundDust effect on the sandy landscape.
  • Cloud shadows – M_CloudsShadow.



The BP_InfinityFog actor can be used to change fog settings in certain areas. The fog actor is divided into two parts:

  • Fog volume – the volumetric shape of the fog, rendered as an overlay mesh.
  • Fog atmospherics – atmosphere settings. When the camera is inside the fog volume, the controller blends into these new parameters based on the Weight attribute value.

Currently only an ellipsoidal fog shape is supported.


Accumulation is the group of parameters that controls the coverage of snow and the wetness of meshes that use the MF_InfinityWeatherSurface node in their master material.

MF_InfinityWeatherSurface is an advanced node that adjusts base color, normal, and roughness values to the current weather conditions, set by two basic parameters used in the material:

  • Weather.Controller.WetSurface – Controls wetness of the surface
  • Weather.Controller.SnowySurface – Controls the snow shell on the surface.
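What such a surface node does can be sketched as a pair of blends driven by the two parameters above. The exact curves and constants used by MF_InfinityWeatherSurface are assumptions here, chosen only to show the principle:

```python
def weather_surface(base_color, base_roughness, wet, snowy):
    """Adjust a material's color and roughness for weather conditions:
    wetness darkens the surface and lowers roughness (shiny), snow blends
    the color toward white and raises roughness. Illustrative sketch."""
    lerp = lambda a, b, t: a + (b - a) * t
    color = tuple(lerp(c, c * 0.6, wet) for c in base_color)   # wet darkening
    color = tuple(lerp(c, 0.95, snowy) for c in color)         # snow whitening
    roughness = lerp(base_roughness, 0.1, wet)                 # wet = glossy
    roughness = lerp(roughness, 0.8, snowy)                    # snow = rough
    return color, roughness
```

Applying both blends at once is what produces the icy-snow look mentioned below: a whitened surface that still has the low roughness of a wet one.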

An example of MF_InfinityWeatherSurface use is presented in the rocks material M_RockSnowMaterial. Drag and drop the rocks mesh (/InfinityWeather/Demo/Environment/Rock/SM_Rock) onto your map to check how weather conditions affect the material. Notice that the combination of wet and snowy materials can result in a nice-looking icy snow effect.

Left top: Wet Materials = 0.0, Snowy Materials = 1.0. Left bottom: Wet Materials = 0.0, Snowy Materials = 1.0. Right top: Wet Materials = 1.0, Snowy Materials = 0.0. Right bottom: Wet Materials = 1.0, Snowy Materials = 1.0.

Additionally, meshes can be painted with vertex color (red channel) to mask the effect on meshes that are hidden under occluders.

A more detailed description of additional parameters can be found in the landscape and displacement section.


The Infinity Weather system introduces an advanced displacement material system based on render targets. In short, the BP_InfinityDisplacementCapture actor searches for actors around the focus point that implement BPI_DisplacementShape. The BPI_DisplacementShape implementation adds a list of shapes (shape type and transform) to a stack. In the final stage, the stack of shapes is rendered to a displacement texture. The iteration is repeated every frame with some random offset.

3 elements are needed to make the system work:

  1. A displacement receiver – a landscape with a material that supports displacements (MI_LandscapeSnow/MI_LandscapeMud/MI_LandscapeSand, MI_LandscapeCombined).
  2. A displacement capture actor placed on the map (BP_InfinityDisplacementCapture), set to capture the landscape ground.
  3. A displacement mesh actor or component that will affect the ground (BP_InfinityDisplacementStaticMeshActor).

The video below shows how to combine all these tools and make them work:

Landscape materials

Landscape materials are an advanced topic because injecting them into a project requires basic knowledge of the Unreal material system.

The example content (Landscape/Materials) comes with a few examples of landscape configurations to make this step easier. There are simple materials that cover the landscape entirely with a single type of displacement:

  • M_LandscapeSnow (uses MF_InfinitySnow)
  • M_LandscapeMud (uses MF_InfinityMud)
  • M_LandscapeSand  (uses MF_InfinitySand)

There is also an advanced version of the material that combines all three effects in one landscape. Additionally, the extended version supports MF_InfinityWeather. It is available in two versions – virtual texturing and multilayered:

  • MI_LandscapeCombinedLayered
  • MI_LandscapeCombinedVirtual 

Let’s take a look at the example Sand (M_LandscapeSand) material used on DesertMap:

The material is a combination of four nodes:

  1. MF_GroundSand – the base material layer of sand that contains a color map, normal map, roughness, etc. It can be replaced by any material, for example one created from GameTextures/Example Project/Any pack.

  2. MF_InfinityDisplacement – the base node that reads the displacement render target and returns data from it. As an input, it takes the displacement layer intensity; this is the layer that will be used to paint the landscape.
  3. MF_InfinitySand – combines data from the displacement and ground layers. Additionally, this node implements some extra effects like coloring displaced ground. Two other nodes could be used here to achieve a different effect: MF_InfinityMud and MF_InfinitySnow.
  4. MF_InfinityWeather – adds the overlay effect of a wet surface that can be controlled by the weather controller.

Displacement capture

The BP_InfinityDisplacementCapture is the main actor that prepares displacement depth textures for the landscape. Below is a simple explanation of how this actor’s algorithm works:

  1. The displacement capture actor searches for actors that overlap the displacement area.
  2. If an actor that supports the displacement interface is detected, it is asked for a list of shapes that displace the ground.
  3. The newly generated shapes are added to the stack.
  4. In the final step, the displacement capture actor renders all shapes added to the stack.
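The four steps above can be sketched as a gather-and-stamp loop. The real system renders the shape stack to a texture on the GPU; this CPU sketch stamps sphere shapes into a small height grid purely to show the data flow (all names and values are illustrative):

```python
def capture_displacement(shape_providers, grid_size=16, cell=10.0):
    """Ask each provider (an actor implementing the displacement interface)
    for its shapes, collect them on a stack, then 'render' every sphere
    (cx, cy, radius, depth) into a 2D height grid with a soft falloff."""
    stack = []
    for provider in shape_providers:           # steps 1-3: gather shapes
        stack.extend(provider())
    grid = [[0.0] * grid_size for _ in range(grid_size)]
    for cx, cy, radius, depth in stack:        # step 4: render the stack
        for j in range(grid_size):
            for i in range(grid_size):
                x, y = (i + 0.5) * cell, (j + 0.5) * cell
                d2 = (x - cx) ** 2 + (y - cy) ** 2
                if d2 < radius ** 2:
                    fall = 1.0 - (d2 ** 0.5) / radius   # soft radial falloff
                    grid[j][i] = max(grid[j][i], depth * fall)
    return grid
```

Keeping the maximum per cell means overlapping shapes deepen the imprint rather than cancel each other, which matches how footprints accumulate in snow.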

A simple configuration of the scene with displacements is presented below:

  1. Drag and drop BP_InfinityDisplacementCapture on the map.
  2. Add the Landscape actor to BP_InfinityDisplacementCapture.GroundMeshes.

This step is important – it notifies the system about the receiver of the displacements.

  3. Select the landscape and use a material that supports displacements, for example M_LandscapeSand.

  4. Edit the landscape and paint the layer of displacement on it.

For optimization, displacements are rendered only in the area around the focus point. The focus point is taken from BP_InfinityDisplacementCapture.CaptureActor. If CaptureActor is null, the character Pawn is used as the focus. If the Pawn is also null, the camera location is used.
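The focus-point fallback chain described above can be expressed directly (a trivial sketch, treating each input as a location or None):

```python
def resolve_focus_point(capture_actor_location, pawn_location, camera_location):
    """Pick the displacement focus point: CaptureActor if set, otherwise the
    player Pawn, otherwise the camera (sketch of the fallback order)."""
    if capture_actor_location is not None:
        return capture_actor_location
    if pawn_location is not None:
        return pawn_location
    return camera_location
```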

Two types of render targets are defined in the system. Both area sizes can be changed in the capture displacement actor.

  • Capture Render Target – caches shapes around the focus point. Modify the CaptureTextureSize attribute.
  • World Render Target – combines all cached data. Modify the render target TR_Persistent resolution to adjust the size of the history.

Simple static shapes

The displacement map rendering is based on ray-traced shapes in the shader to get the best possible efficiency, but it is also limited to the number of shapes predefined in the shaders. The shapes that can displace the ground are defined in the /InfinityWeather/Displacement/Blueprints/Shapes/ folder:

  • BP_DisplacementStaticMeshCapsule
  • BP_DisplacementStaticMeshCube
  • BP_DisplacementStaticMeshCylinder
  • BP_DisplacementStaticMeshSphere
  • BP_DisplacementDecal (heightmap mapping on the ground)

All of them can be placed on the map and scaled.

Use the Debug option in BP_InfinityDisplacementCapture to preview the shapes that are rendered by the displacement capture.

Character displacement

BP_InfinityDisplacementComponent is a component that allows attaching a list of shapes that displace the ground.

An example implementation of the displacement component in a clean mannequin character is presented in the video below:

Adding displacements to your character is easy:

  1. Open the character blueprint and add BP_InfinityDisplacementComponent to your character.
  2. Select the component and set the Displacement Shapes data asset. You can use the full ragdoll (DS_MannequinRagdoll) or the foot-only (DS_MannequinFoots) implementation, which is faster and prepared for walking.
  3. Select “Class Settings” in the top bar and add the BPI_DisplacementShape interface to the list of Implemented Interfaces. It should look like on the screenshot:
  4. Now implement the interface function called AddDisplacementShapeData.

After those few steps, the character will be detected and ready to work.

Custom collision type

The Infinity Weather displacement capture system uses a box collision volume when searching for actors that should be captured in the displacement buffer. The overlap detection requires a specific collision type for filtering collision shapes. The engine has no option to ship a custom collision type in an external marketplace package because collision types are defined in the configuration files.

By default, Infinity Weather takes advantage of the predefined Destructible collision type, but that is not the best solution for every project – this collision type can be occupied by other functionality and turned off for specific actors like characters.

The best way to overcome this problem, and also improve performance during the overlap test, is to create a custom collision type and use it to detect shapes in the displacement area. This solution will also fix all issues with ALS ragdoll detection and other strange behaviors that could occur after placing the BP_InfinityDisplacement actor on the map.

Creating a custom collision type is described in the Unreal Engine 4 documentation, and the explanation below shows how to apply this knowledge with Infinity Weather:

1. Open Edit->Project Settings->Collision

2. Open the Object Channels tab and add a New Object Channel type called “InfinityCapture”, set to ignore by default.

Adding custom collision type.

Now you have to set a new collision type in the displacement capture actor.

It’s good practice to inherit from the BP_DisplacementCapture class and edit the child class instead of changing package assets. If you do it this way, the next package update will not break your project.

  1. Open BP_DisplacementCapture (or child instance) and find CaptureVolume component.
  2. In CaptureVolume->DetailPanel->Collision->CollisionPresets->ObjectType set the InfinityCapture type.
Object type set to Infinity Capture in displacement capture actor.

From now on, every actor with Collision Response->InfinityCapture->Overlap set to true will be considered during the displacement detection process.

  1. Open the character class and select the collision component that should be detected (CapsuleComponent or Mesh).
  2. In the details panel, find the Collision tab -> Collision presets.
  3. If the collision preset type is “Custom…”, then just switch InfinityCapture->Overlap to true.
  4. Otherwise, you will have to create a custom preset or edit the existing preset the same way in the project collision settings.
    Infinity Capture Overlap set to true in a captured actor (Pawn or shape).
  5. Remember to repeat these steps for all actors that should be detected by the displacement capture, like BP_InfinityDisplacementStaticMeshActor.

You can use the debug option in the BP_InfinityDisplacement actor to see if the actor is detected.

Displacement data asset

The configuration of displacement shapes attached to actors is stored in Data assets (PDS_DisplacementShapes).

Infinity Weather contains a few example data assets for mannequins and vehicles, but any user can create custom data assets that will work with a specific skeletal mesh. Creating a displacement data asset is simple:

  1. Right-click in the content browser.
  2. Select Miscellaneous->DataAsset.
  3. Pick PDS_DisplacementShapes and create it.
  4. Name the newly created data asset (e.g. “DS_ExampleDataAsset”) for future use.

The newly created data asset can be used in characters, but we still need some shape definitions. Shapes can be added by hand, but Infinity Weather comes with a simple editor that helps preview how shapes are attached.

Displacement shapes editor. Ragdoll in edit mode.
  1. Drag and drop the editor class onto the map (BPU_InfinityDisplacementShapesEditor).
  2. Select the editor and choose the newly created data asset (DS_ExampleDataAsset) in the Data attribute.
  3. Pick the skeletal/static mesh reference that will be a visual representation of the shape that makes the displacement. It can be your character or a car.
  4. Press Load; now you can add shapes to the list. The shape structure is described below.
  5. When you are done, press “Save” to save your changes. Remember to apply the newly created data asset in the DisplacementComponent.
Shape properties:

  • Socket – Bone or socket name used as the attachment transform. The system will use the component transform if the socket name is not defined.
  • Shape.Type – Shape type. Currently supported: Box/Sphere/Cylinder/Capsule/Decal, Trail Wheel, Trail Sphere.
  • Shape.Intensity – Scales the intensity of the interaction (not implemented yet).
  • Shape.Transform – Relative transform in the space of the attachment.
  • Shape.Pattern – The texture used as a pattern in the displacement decal.


The weather system also supports advanced footstep system integration based on notifiers placed in animations.

Working with footstep system:

  1. Add the BP_InfinityFootstepComponent component to a character. The component configuration contains default predefined templates of effects and attachments for the mannequin.
  2. Add BP_NotifyFootstep to the walking/running animation at the moment when the foot hits the ground. Choose the left or right foot in the properties.
  3. Add BP_FootstepVolume to your scene and select the footstep preset that should be spawned inside the volume.

Footstep volume

The engine sometimes detects the footstep ground incorrectly because of limited landscape-layer blending during ground tracing. Footstep volumes (BP_FootstepVolume) can solve this problem easily. When the character is inside the volume, the system uses the footstep set in the volume definition, so the incorrect footstep effect is overridden.

The priority and required conditions are taken into account to choose the most relevant footstep effect, and results can even be filtered. The example below shows how to configure a volume with priority 2 (a higher priority is more important) that spawns snow and will be active only when at least 0.1% of the weather displacement is active.
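The priority/condition selection can be sketched like this. The field names and the displacement condition are illustrative, not the volume's actual properties:

```python
def pick_footstep(default_effect, volumes, displacement):
    """Among the volumes the character is standing in, pick the
    highest-priority one whose condition passes; otherwise fall back to
    the traced default effect (illustrative sketch)."""
    best = None
    for v in volumes:
        if displacement >= v.get("min_displacement", 0.0):  # condition check
            if best is None or v["priority"] > best["priority"]:
                best = v
    return best["effect"] if best else default_effect

# Hypothetical overlapping volumes: snow requires some accumulated displacement.
volumes = [{"effect": "snow", "priority": 2, "min_displacement": 0.001},
           {"effect": "mud", "priority": 1}]
```

With accumulated displacement present, the snow volume wins on priority; without it, the condition fails and the mud volume takes over.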

Screen Effects

The BP_InfinityPostProcessComponent, placed in an actor that contains a camera component, will automatically add the post-process material and communicate with the weather controller to get information about the current conditions.

Three types of post-process view objects are supported:

  1. Post-process camera – by default the system searches for a camera in the owner.
  2. Post-process component – if there is no camera in the owner, the system searches for a post-process component.
  3. Post-process volume – a custom volume can be set for the character on the scene by setting the Postprocess volume attribute.

A custom post-process object can also be set with the SetPostProcessView function.


  • Raindrops and circles show only when the precipitation actor has InteractWeather > 0.0. That means raindrops show only inside the rain volume.
  • Screen raindrops fade after some time. The duration of the fade can be controlled by the BP_InfinityPostProcessComponent.Rain parameters.
  • When the camera is under cover it does not get wet and raindrops do not appear.
  • Freezing and heat haze distortion are calculated locally from the weather controller and the fog actor’s atmosphere (temperature, distortion).
Raindrops post-process. The effect activates when the weather is rainy.

The screen effects are implemented using two post-process materials.

MI_InfinityPreTranslucent – a group of effects rendered before the translucency layer.

  • Rain Circles – Enables automatically when rain precipitation is active.
  • Distortion – Heat haze distortion effect.
  • Glittering – Experimental (disabled) effect of glittering rendered on snow and sand.

MI_InfinityPostProcess – a group of effects rendered after the tone-mapping.

  • Sharpening – Additional sharpening effect that improves the quality of close objects (can be disabled by UseSharpening).
  • Drops – Animated raindrops on screen, visible only if the camera is exposed to rain.
  • Frozen – Frozen screen edges.


Weather replication is implemented server-side. That means all functions from BP_InfinityWeatherController that control the weather should be activated on the server; the state will then be automatically sent to the clients. The weather state is also replicated when a player joins the game after some time.

Displacement render targets are not replicated; they are only calculated locally around the focus point of the displacement capture actor that follows local player 0. That means if multiple actors are walking in the same area at the same time, the result will be calculated locally and be similar for all of them: not because it's replicated, but because it's simulated locally with the same data.

Two players connected in a multiplayer game and displacements.

When testing displacements in multiplayer, be sure to run separate instances of the game; otherwise UE4 shares the memory of the ground texture and bugs appear because multiple displacement actors render to the same displacement texture at the same time. This issue does not affect the final released game, so don't worry. Use the option SingleProcess=false to test the game in the editor with proper displacements in multiple windows.

Testing multiplayer setup.

Future features that will be implemented for multiplayer games:

  • Local weather controllers are not supported yet, but they are on my roadmap.
  • Replicated footprint decals that will be sent to clients and activated when displacement actors are close.


Advanced Locomotion System

Collision setup

The most important part is implementing a custom collision type that will solve most of the problems related to ALS and IW volume overlapping. You can find detailed descriptions of this step in the chapter called “Custom Collision Type”.

After you finish this collision configuration, it is also worth setting Climbable = Ignored in the Trace Responses of the Capture volume component because it's not needed. The final configuration of the InfinityCapture actor should look like below:

It’s also important to update the ALS_character collision preset in the project config. The InfinityCapture channel should be set to overlap.


When the collision setup is ready, it's time for the ragdoll. During ALS ragdoll mode, capsule overlap is not detected, which is why displacements are not rendered by default. There are two possible methods to fix it, and the choice is yours:

  1. Select the ALS Character Mesh, switch Generate Overlap Events = true, and set the InfinityCapture channel to Overlap.
  2. An alternative, more general method: ALS_Character -> Add Component -> Sphere Collision on the character actor, and configure the attributes as presented on the screen below:

There is another important thing about ragdolls: switching between the ragdoll/feet presets.

The DS_MannequinRagdoll preset is slower and more precise; the DS_MannequinFoots preset works perfectly at low FPS because it supports continuous movement. Fortunately, ALS supports this feature very easily: ALS_AnimMan_CharacterBP has the functions RagdollStart and RagdollEnd, which perfectly fit the requirements of switching.

Use SetDisplacementShapesDataAsset on the DisplacementComponent.

  • RagdollStart should set the DS_MannequinRagdoll asset as a parameter
  • RagdollEnd should set the DS_MannequinFoots asset as a parameter





Voxel Plugin

Coming soon



Questions & Answers

I can’t see any precipitation below height 0.0 on the level.

Add BP_InfinityPrecipitationOcclusion on your map.

Explanation: By default the system treats height 0.0 as the default occluder. BP_InfinityPrecipitationOcclusion will analyze your level occlusion and update the height of the ground.

Why is the landscape flickering after painting the displacement?

Solution: Use a geometry brush on the landscape to displace the mesh a bit and the mesh will not disappear anymore.

Explanation: It's an engine bug that I can't overcome, but it's easy to fix in your project. The only way to fix it is to increase the size of the landscape component bounding box by editing the ground. The bug is related to the bounding boxes of landscape fragments: if the landscape bounds are hidden under the displacement mesh (extruded using vertex offset), the occlusion system removes that fragment of the landscape. It flickers because occlusion is probably calculated based on the previous frame, and in the previous frame the component visibility was the opposite.



Lightning Fast is a powerful combination of materials and blueprints that implements a wide range of realistic and stylized electricity effects.

Lightning effects have plenty of uses in games, from background ambiance during a storm and electric fences to devastating lightning guns and spells. This system is designed to achieve all those effects with high performance and AAA quality at the same time.


Supported effects:

  • lightning flashes
  • lightning bolts
  • lightning chains
  • lightning strikes
  • lightning discharges
  • lightning beams
  • lightning trails
  • lightning guns
  • lightning electrocute


  • GPU friendly and optimized for all platforms
  • Wide range of high-quality effects
  • Two types of beam rendering (spline mesh/spline billboard mesh)
  • Integrated with UE4 dynamic lights, material light functions, and splines
  • Over a hundred parameters in materials to customize the final effect
  • Spline-based bolt shape that is easy to adjust to the scene
  • Character electrocute discharges effect
  • Advanced lightning mesh editor tools
  • Multiple animated lines rendered in a single beam
  • Depth fade/direction fade effect
  • Glow and line color
  • Refraction based distortion
  • Procedural beam texture
  • Dynamic branch masking and fading effect

Additional content:

  • prototype pack – textures and materials created for demonstration purposes
  • mesh generator – advanced tool prepared for editing and generating lightning meshes, allows implementing external lightning propagation class
  • demo example map – a showcase of 14 use cases
  • Procedural mesh lib – Additional library of functions that helps to create static meshes in UE4



Getting started

Basically, everything starts with the geometry (meshes) and the effect (materials) rendered using that geometry.

The Lightning Fast mesh combined with the material has some useful features and properties that are very important in production.

  • can be used with spline meshes
  • can be skinned and used as skeletal mesh 
  • can be scaled and keeps the proper shape
  • can be used in a particle system
  • can be watched from every angle

That gives a wide range of effects that the system can handle, so the package is divided into multiple blueprints specialized for particular effects.

All of the effects can be found in the Blueprints folder and are presented on the ExampleMap (Demo/Maps/LightningFastMap).

Example use-case:

  1. Find the folder Blueprints/LightningBolt 
  2. Drag and drop BP_LightningBolt onto the scene
  3. The effect is ready to use. Now you can adjust parameters to get the effect that you need.



The Blueprints folder contains multiple lightning effects that show how the system works with materials and meshes. It should be treated as example content or templates prepared for general use cases. Some of the effects are blueprints; others are represented only by particle effects used by the main character.

Check the Demo/Mannequin/ThirdPersonCharacter blueprint to analyze the implementation of the skills.

The list of effects will be extended in future updates. 

It’s good practice to create your own effects by inheriting the Blueprint classes and copying material instances to your project folder.

Lightning Spline

The Blueprints/BP_LightningSpline blueprint is a simple combination of spline mesh effects and lights. You can combine any mesh created using the Lightning Fast package with splines; even bending the lightning bolts is allowed.

Drag and drop BP_LightningSpline on the scene and modify parameters. [Editing splines]

Mesh Description
Static Mesh Mesh used to bend
Material The material used on mesh
Forward Axis The axis of the mesh that points forward along the bend.
Translucency Sort Priority Rendering priority
Fix Spline Direction Additional correction of spline UP direction. Fixes some twist bugs.
Texture Scale Scale texture UV for beam start and end.
Glow Color Override material glow color.
Line Color Override material line color.
Lights Description
Lights Count Number of lights created for spline
Light Snap Curve Importance of snapping the lights to the curve (0-1)
Light Color Override color of light
Light Falloff Fall off parameter of light.
Use inverse squared falloff Falloff mode switch.
Light Function The light material function used for blinking

Lightning Bolt

BP_LightningBolt is a complex example of a lightning bolt effect implementation based on the lightning spline blueprint. The biggest advantage of this blueprint is that it can be easily integrated with any weather system.

BP_LightningBolt Description
Animation Bolt animation curves describe how parameters change during the update. R – time offset, G – fade, B – (not used yet), A – lightning intensity.
Animation Rate Speed of animation update.
Activation Loops -1 – infinite, 0 – not active at start, >0 – number of loops.
Activation Delay Delay until the first activation.
Preview Time Draw debug preview at some time of animation.
Retrigger Time Random time in range (x-y) that the system waits until the next activation.
Use Randomization Whether the system should randomize rotation and location in Random Volume.
Rotation Range Range of rotation change during randomization.
World The world that manages the light intensity on the scene

Use the World parameter to register the blueprint in BP_LightningWorld. The BP_LightningWorld actor is a manager that changes the directional light intensity and skylight intensity during stormy weather according to all managed lightning bolts.

To activate the Lightning Bolt manually (from events), use the Activation function and set ActivationLoops = 0.

Lightning Beam

A dynamically updated chain of splines that connects target points within the closest range.

A simple use case of working with a beam:

  1. Drag and drop BP_LightningBeam onto the scene
  2. Drag and drop a few target actors (BP_LightningTarget) onto the scene near the lightning beam actor.
  3. Select BP_LightningBeam and edit the parameters:
  4. Find the InitChain array and add one element. Set up the starting chain element. 
    Setting TargetActor to self makes the beam actor the source of the effect.
  5. Play and watch how the beam connects the target actors that are moving around.

Basically, the concept of the BP_LightningBeam actor is that the chain is represented by multiple nodes connected by spline meshes in a specific order. The list of nodes and their order can be set by the user or generated automatically.

The BP_LightningBeam effect is divided into static and dynamic parts.

  • Static (set by user) – predefined, represented by InitChain structure created by the user during activation.
  • Dynamic (generated) – generated starting from the last element of the static fragment using an algorithm of searching in range.

InitChain is the array that describes the connections between chain elements. The first element of the array is the root (starting point). For example, the InitChain of a lightning gun can be represented by 2 nodes: the first node is the gun muzzle and the second one is the hit point. ParentIndex is the index of the parent node in the InitChain.

You can even specify a looping shape in the InitChain with this kind of config:
InitChain[0] = (TargetActor=Target0, ParentIndex=0) //start, points to self so it’s empty
InitChain[1] = (TargetActor=Target1, ParentIndex=0) //spline from element 0 to Target1
InitChain[2] = (TargetActor=Target2, ParentIndex=1) //spline from element 1 to Target2
InitChain[3] = (TargetActor=Target0, ParentIndex=2) //spline from element 2 to Target0 (loop)
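A minimal sketch of how such an InitChain could resolve into spline segments (assumed logic, not the blueprint's actual implementation): each element draws a spline from its parent's location to its own, and the root, whose ParentIndex points at itself, draws nothing.

```python
# Illustrative resolution of an InitChain into spline segments; the
# (location, parent_index) data layout is assumed for this sketch.

def chain_segments(init_chain):
    # init_chain: list of (location, parent_index). An element whose
    # parent_index equals its own index is the root and draws no spline.
    segments = []
    for i, (location, parent_index) in enumerate(init_chain):
        if parent_index == i:
            continue  # root element: no incoming spline mesh
        segments.append((init_chain[parent_index][0], location))
    return segments

# The looping config from the text: Target0 -> Target1 -> Target2 -> Target0
loop_chain = [("Target0", 0), ("Target1", 0), ("Target2", 1), ("Target0", 2)]
print(chain_segments(loop_chain))
```

For the looping config this yields three segments, closing the shape back at Target0.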

Init Chain Node Description
Target Actor The actor used as a node in the chain. The algorithm uses the location of the actor as the endpoint of the spline mesh.
Parent Index The index of the element in the InitChain array where the algorithm will look for the starting point of the spline mesh. InitChain[ParentIndex].TargetActor is used to set the location.
Target Location If the target actor is null, then Target Location is used.
Target Normal Overrides the normal direction of the spline mesh. It can be used to change the shape of the spline.
Source Direction Overrides the source direction of the spline mesh. It can be used to change the shape of the spline.
Null Target How to react if the actor is destroyed/null:

  • None – use TargetLocation and do nothing
  • Deactivate – deactivate the actor
  • Generate – regenerate the chain using dynamic generation.
  • Remove – remove the node and continue as before.

The last element of the InitChain is the starting point of the search. The search uses the attributes in the Search tab to specify which actors can be in the dynamic chain.

After the InitChain is specified, dynamic chain generation begins. ChainLength is the number of spline meshes that can be rendered by the BP_LightningBeam actor. The ChainLength attribute is used to calculate how many additional nodes should be generated: the number of dynamic nodes the algorithm will look for is ChainLength reduced by the number of renderable connections specified in the InitChain.

Example situation (lightning gun):
ChainLength = 4
InitChain contains 2 elements; the connection between the start point (muzzle) and one target point is represented by 1 spline mesh.
Finally, the algorithm will look for 4-1 = 3 target points and try to render 3 additional spline meshes.
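The arithmetic above can be written down as a tiny helper (a sketch; the function name is hypothetical):

```python
def dynamic_node_budget(chain_length, init_chain_size):
    # The root of the InitChain draws no spline, so the number of
    # renderable connections is the element count minus one.
    init_connections = init_chain_size - 1
    return chain_length - init_connections

# Lightning-gun example from the text: ChainLength = 4, InitChain has
# 2 elements (muzzle + hit point), so 4 - 1 = 3 extra target points.
print(dynamic_node_budget(4, 2))  # 3
```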

The search algorithm is simple:

  1. Find closest target actor in range to the last element of the chain
  2. Add the element to the chain.  
  3. If the combined chain is smaller than ChainLength, then go to 1

After the search, all the nodes are linked together in a chain 0-1-2-3 … -n. The UseClosestToChain=true option runs an additional update that changes the topology of the chain to render the shortest possible spline meshes; basically, it looks for the closest connections between nodes.
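The three-step search loop can be sketched as a greedy nearest-neighbour extension of the chain (illustrative Python, not the blueprint code; the point format and the search_range parameter are assumptions):

```python
import math

def extend_chain(chain, targets, chain_length, search_range):
    # chain: node locations already fixed by the InitChain.
    # Greedily append the closest in-range target to the chain tail
    # until the combined chain renders chain_length spline meshes
    # (connections = nodes - 1), or no target is left in range.
    targets = list(targets)
    while len(chain) - 1 < chain_length and targets:
        tail = chain[-1]
        nearest = min(targets, key=lambda t: math.dist(t, tail))
        if math.dist(nearest, tail) > search_range:
            break  # nothing left in range
        chain.append(nearest)
        targets.remove(nearest)
    return chain

# Muzzle + hit point, then up to 3 dynamic targets; the far (40, 0)
# target is never reached because the budget is exhausted first.
chain = extend_chain([(0, 0), (1, 0)],
                     [(2, 0), (3, 0), (4, 0), (40, 0)],
                     chain_length=4, search_range=5.0)
print(len(chain))  # 5 nodes -> 4 spline meshes
```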

Lightning Gun

A complex example of a lightning gun implementation based on the Lightning Beam blueprint.

An example Lightning Gun use is implemented in the demo character Demo/Mannequin/ThirdPersonCharacter.

Lightning Discharges

A particle emitter that presents the discharge effect.

Lightning Trail

An effect of a trail attached to an actor. In the example content it was used as a ribbon behind the character during a slow-motion run.

Example Lightning Trail used to show the character ribbon during a slow-motion run. Implemented in the demo character Demo/Mannequin/ThirdPersonCharacter.

Lightning Target 

An object targeted by the lightning gun and other skills. The target also spawns an example discharge particle when it is attacked by a lightning beam or lightning gun.

Lightning Source

The folder contains a base class of dynamic light source components and light materials.


M_LightningFast is the heart of the package, and all material subtypes derive their functionality from this advanced master material.

Creating a material instance in UE4 is really simple. Right-click the material (M_LightningFast) and choose the option “Create Material Instance”. After that operation, the newly created material is ready to configure and use on Lightning Fast meshes. You can do the same with all the material instances created for specific cases stored in the Blueprints folder, which I recommend doing first.

The material has very high potential for creating a wide range of effects. The following list includes all available configuration options with usage recommendations. Some of the parameters can be hard to describe and understand at first, but I encourage you to test them in the engine and check the results of the changes. 🙂

Base, Fade

List of general basic options.

Base Description
MeshWidthScale Scales the geometry of billboarding mesh.
Opacity Opacity/Additive mode.
OpacitClamp Opacity clamping for high values.
UseBillboardMesh True – use billboarding system for rendering, false – render original mesh
UseDebugMesh Draw debug mesh
UseRefraction Use the refraction effect.
RefractionClamp Clamp intensity of refraction effect.
Refraction Refraction scale.
UseVertexInterpolators Optimization that forces the system to use interpolators (does not work with particle emitters).

A group of parameters that helps to fade the effect in specific conditions, e.g. close to a wall or at an unpleasant angle.

Fade Description
UseNormalFade Normal fade is a feature that helps to hide the beam when it is at a bad angle relative to the mesh. The fade is calculated based on the normal vector of the mesh. For a mesh in billboard mode, the normal means the axis of rotation. When the axis is similar to the camera direction, the mesh should be invisible because some bugs would be noticeable.
NormalFadeScale Scales the effect, making it fade faster or slower.
NormalFadeOffset Offsets the effect to make it more visible. 
Fade Makes the effect translucent.
UseDepthFade Whether lightning should fade when intersecting scene meshes.
DepthFadeScale How fast lightning should fade when intersecting scene meshes.
UseVertexFade The vertex color of lightning meshes contains additional information about fade; this option uses it for fading the beginning and end of the node.
VertexFadeStart Scale vertex fade start
VertexFadeEnd Scale vertex fade end

Branch, Masking

The lightning beam is built of four branches that can be displayed at the same time or one after another. This section covers the basic settings.

Base Description
UseBranchFading Activate the algorithm of asynchronous fading of branches.
BranchFadingAlpha Alpha multipliers of each branch of the beam, stored separately in RGBA channels.
BranchFadingOffsets Time offsets in fading animation.
BranchFadingPower Exponential fading speed of endpoint.
BranchFadingSource Fading blend in of start point.
BranchFadingScale The scale of fading UV.
BranchFadingTimeScale Branch fading animation speed.
BranchFadingOffset External controller of branch fading time.
UseBranchCombine Combine branches into one beam. Set False to use separate colors.
UseBranchCombineMax Combine branches by max function. False forces system to use sum function.
UseAlphaChannelBranch Use the fourth branch from the alpha channel.

Masking is a feature for randomly hiding the branches of the lightning. Even static meshes can look dynamic and varied thanks to masking some of the nodes during the update.

Beam, Bolt

Beam configuration focused on UV mapping transformations.

Beam Description
BeamUVScaleSpeed UV mapping of the distortion texture (the most important parameter):

  • RG – the xy scale of the beam distortion UV (a higher value means denser UV)
  • BA – the speed of UV movement (a higher value means a more dynamic effect)

An additional skew UV offset stretches the distortion.

UseBeamBlend Beam blend is the effect of fading the beam at its start and end.
BeamBlendStart The intensity of beam blending at the start.
BeamBlendEnd The intensity of beam blending at the end.
UseVertexColorUVOffset Adds a UV offset based on the vertex color R value. This parameter helps make the mesh beam look more varied.

Bolt is an effect that allows animated offsetting of the beam position on the mesh.

Bolt Description

Whether the bolt animation should be active.

BoltStart/BoltEnd Start/end position of bolt.
BoltMoveOffset Sets the offset of bolt animation (can start from any point)
BoltMoveRange Sets the value range of bolt animation.
BoltMoveSpeed Speed of bolt animation.

Distortion, Line

Each beam is created from multiple light lines that are distorted by a noise texture. The texture is moving, so the effect looks dynamic and chaotic.

Distortion Description
DistortionScale Scales the distortion effect.
DistortionTexture The texture used for distortion offset and detail distortion offsets.
UseDistortionDetail Whether distortion detail should be active.
DistortionDetailUVScale Scales the UV mapping of detailed distortion.
DistortionDetailScale Scales the intensity of detailed distortion.

Describes the basic parameters of the line that is distorted and generates the glow.

Line Description
LineColor Color of line
GlowColor Color of glow effect around the line
GlowTexture The texture used for masking glow.
GlowTextureScaleMove Animates the glow texture: RG – scale, BA – move.
UseGlowTexture Whether the glow texture effect should be active.

Width of line.


Line disappearing in distance from the start point of lightning.


The material supports two modes of rendering the glow effect around the line: Hard/Soft. The first one is based on the smoothstep function and the second on division by distance. 
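As a rough illustration of the two modes (assumed formulas for this sketch; the actual material graph may differ): the hard mode cuts the glow off with a smoothstep over the distance to the line, while the soft mode divides an intensity by that distance.

```python
def smoothstep(edge0, edge1, x):
    # Standard HLSL-style smoothstep.
    t = max(0.0, min(1.0, (x - edge0) / (edge1 - edge0)))
    return t * t * (3.0 - 2.0 * t)

def glow_hard(distance_to_line, width):
    # Hard mode: sharp falloff, fully dark beyond `width`.
    return 1.0 - smoothstep(0.0, width, distance_to_line)

def glow_soft(distance_to_line, intensity):
    # Soft mode: long 1/distance tail that never quite reaches zero.
    return min(1.0, intensity / max(distance_to_line, 1e-4))

print(glow_hard(0.0, 1.0), glow_hard(1.0, 1.0))  # 1.0 0.0
print(glow_soft(2.0, 0.5))                       # 0.25
```

The practical difference is in the tail: hard glow ends abruptly, soft glow bleeds far out.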


Mesh editor

The Lightning Fast package contains the mesh editor BP_LightningMesh, which allows creating unique lightning meshes. Drag and drop the blueprint Blueprints/LightningMesh/BP_LightningMesh onto the scene to start creating a lightning mesh.

Each mesh is generated from a node list (the Nodes array attribute). The node list is an array that contains a graph of connections between nodes, described using a special structure called BS_LightningNode.

BS_LightningNode Description
Parent This attribute is the index of the parent node in the array. Each node has a parent. If a node's parent index is the same as its own index, that node is the root (starting point) of the hierarchy generated from the node list. 
EndPos Location of the ending point. It can also be edited visually in the main viewport by selecting it and using the translation gizmo.
Loop A value greater than or equal to 0 is the index of the loop endpoint. -1 is the default and means that there is no loop.
Width Node width.
Priority Higher priority means that the node describes the main branch.
TangenInOut The scale of the tangent input-output vector.
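The parent/root convention from the table can be sketched as follows (illustrative helpers; only the Parent index described above is used):

```python
def find_root_indices(parents):
    # parents: list of BS_LightningNode.Parent values. A node whose
    # parent index equals its own index is a root of the hierarchy.
    return [i for i, parent in enumerate(parents) if parent == i]

def children_of(parents, index):
    # All nodes that attach to the given node (excluding the root's
    # self-reference).
    return [i for i, parent in enumerate(parents)
            if parent == index and i != index]

# Node 0 is the root; nodes 1 and 2 branch from it; node 3 from node 2.
parents = [0, 0, 0, 2]
print(find_root_indices(parents))  # [0]
print(children_of(parents, 0))     # [1, 2]
```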


Lightning mesh during editing on level. Node 0 is the root.

The package comes with additional tools that help to create unique lightning meshes in the Unreal Engine 4 editor. There are two methods of editing the Nodes array.

  1. Editing by hand using Tree Editor
  2. Editing using Tree Generator

Tree editor

The Tree editor is the basic method of editing the node list. There are multiple buttons that speed up building new branches of the lightning mesh.

Editor Buttons Description
Node Append Appends a child to the selected node (selected node index).
Node Insert Divides the selected node (selected node index) by inserting a node in the middle.
Node Remove Removes the selected node (selected node index).
Node Selected Get Loads data from the selected index into Selected Node Data.
Node Selected Set Saves data from Selected Node Data to the selected node (selected node index).
Node Transform Transforms the selected node by the Transform to Apply attribute.
Reset Clears the tree.

Tree generator

Lightning Fast has an advanced system of mesh generators that can be used to automate the creation of a wide range of meshes. 

Working with generators is simple:

  1. Add a generator to the Generator List used by the BP_LightningMesh editor.
  2. Choose the class of the generator, e.g. LTG_Branch 
  3. Click Evaluate All Generators
  4. Double-click the generator instance, edit the parameters, and click “Evaluate All Generators” again.

BP_LightningTreeGenerator is the base lightning generator class that can be extended by other generators by overriding the Evaluate function. 

The example generators provided in version 1.0 of the package are described below:

LTG_Branch – An example generator that creates branched lightning effects.

LTG_Spline – Example generator that creates lightning based on spline shape.

Create static mesh

A generated mesh can be easily converted to a static mesh and used in other blueprints and particle effects. A few steps are needed:

  1. Select the BP_LightningMesh blueprint and find the component called ProceduralMesh
  2. Find the option UseDynamicMaterial and set it to false.
  3. Click the Create Static Mesh button
  4. Select where to save the mesh

UseDynamicMaterial = false forces the system to use the default, non-instanced material. If you skip this step, the newly created meshes will contain a reference to the instanced material and won't save.

Questions & Answers

Integration with Fast Stylized Procedural Sky?

Yes! The last update of Fast Stylized Procedural Sky supports integration with Lightning Fast: just place a lightning bolt on the map and activate it on an event from the built-in lightning system.

Is it worth buying?

Hell yes! After four months of:

  • researching the topic,
  • watching thousands of photos and videos of lightning effects,
  • testing multiple solutions,
  • removing huge amounts of unsatisfying effects,
  • and hard time spent on iterations and optimizations,

I can assure you now that it's not worth doing it yourself from scratch when you have a finished solution and my support. I would never start again if I could... 😉


Ice Cool is an advanced master material prepared especially for creating multiple types of ice, like ground ice, ice cubes, icebergs, crystals, glass, and icicles. The translucent material option is a great solution for improving your level's storytelling on cold arctic maps. Designed and optimized especially for Mobile, Virtual Reality, and stylized PC/Console games.


  • Customization options. Over 200 parameters to make it look cool. Additionally, over 50 switches allow controlling the efficiency/quality trade-off.
  • Uses the NEW FAST cracks rendering method. An implementation of a new, cheap, iterative parallax technique for rendering deep cracks based on Signed Distance Field textures. Thanks to the SDF, the effect stays clean and smooth even in closeups while using only 7 texture reads (5 times faster than other methods).
  • Supports multiple types of refraction. Supported: cube map with box projection mapping, screen color, and the built-in refraction pin.
  • Advanced translucency options. Translucency supports depth-fading fog for better quality when covering the objects inside the ice.
  • Subsurface scattering and custom lighting. Useful for more advanced users to better fit the ice into the scene.
  • Extended reflection mode. The system supports pre-rendered cube map reflections mapped on meshes using box projection mapping and spherical mapping. This method is very fast even on translucent materials.
  • Icy vertex offsets. The material supports configurable vertex displacement for icicles.
  • Masked dithering. You can configure the system to use translucent material only where you need it and dither the opaque material around the holes.
  • Animated dust. A configurable dust map with refraction noise makes the material look even deeper and more real.
  • Efficient and GPU friendly. Uses between 90 and 210 instructions depending on the number of features enabled; can be used in VR and even on mobile.
  • Extra tools. The system includes additional tools for generating new Signed Distance Fields and environment cube maps.
  • Multiple useful examples. The package contains an example map with a showcase of multiple samples of using the material.



Get Started

Ice Cool is a pack based on a single advanced master material. The material is placed in the folder IceCool/Materials/M_IceCool. Basically, every user has two ways of working with the package.

  1. Copy Material (the easy one) – Copy an example material and modify the parameters to fit your requirements. You can check the example materials in the demo IceMap.
  2. Create Material (the hard one) – Create a material instance from the master material and set up the material parameters from scratch.

This documentation covers both approaches to creating cool ice material.

It is good practice to create a new folder where the materials created using the Ice Cool package will be stored. Try not to modify or add any content to the Ice Cool package; this will protect your project from problems after downloading new updates of the package.

Copy existing material

The Ice Cool package contains a demo folder with multiple examples of ice materials prepared for specific conditions that can be copied to your project. All examples of ice materials are stored in the Demo/ExampleMaterils folder.

  • Cracked ground
  • Half translucent ice cubes
  • Opaque icebergs
  • Translucent icicles
  • Ice coating
  1. Select the material that you want to use in your game
  2. Right-click the material and choose Duplicate (if you want to copy it) or Create Material Instance (if you want to extend the material and modify parameters).
  3. Move the newly created material into your project folder.

Cracked ground

The cracked ground is the most advanced use of the Ice Cool package. This material is prepared to be used as a ground. The UV coordinates of the ice texture are calculated in world space, so the material can be used regardless of the mapping or ground size. There are multiple types of this material, prepared to fulfil the quality and platform requirements that the game is targeting. The demo lets you see the differences between each type of material by pressing keys 1-7 on the example map.

Material Type Lighting Reflections Platform
Opaque Unlit Material parameters Cubemap All
Opaque Lit Built-in UE4 Reflection Captures All
Translucent Unlit Material parameters Cubemap All
Translucent Lit Built-in UE4 Cubemap PC
Dithered Lit + Translucent Unlit Mixed Cubemap near the camera. PC
Dithered Lit + Translucent Lit Built-in UE4. Best quality. Cubemap near the camera. PC
Dithered Unlit + Translucent Unlit Material parameters Cubemap PC

Half translucent cubes

Half translucent means that this type of ice is a combination of two meshes: opaque and translucent.

  1. Inside layer (MI_IceBlockBigInside) – an opaque mesh rendered from the inside of the geometry (inverted normals, front-face culling)
  2. Outside layer (MI_IceBlockBigOutside) – a translucent mesh rendered traditionally as a layer over the ice.

This kind of material is useful when you want to put meshes inside an ice cube and achieve a good refraction effect. The mesh will be rendered between the inside and outside layers.

Half translucent ice based on two materials – a mannequin mesh placed inside the ice.

Opaque icebergs

A very efficient, good-looking opaque material useful for icebergs; its refraction is based on an external cubemap.

Opaque iceberg material in action

Translucent icicles

This material type was prepared especially for creating icicles hanging from a roof. An additional layer of animated dirt makes it look wet.

Translucent icicles material in action

Create new material

If you are not interested in using the predefined ice materials, there is always a harder path to follow. You can remove all demo content and create a material instance from scratch.

  1. Find the base material in IceCool/Materials.
  2. Duplicate MI_IceCool or create a material instance from M_IceCool.
  3. Rename the newly created material and move it to the ice materials folder in your project.
  4. The material is now ready to use.

I do not recommend this method. It requires a lot of knowledge about the ice material, and this documentation does not cover all the details yet.


The reflection effect is based on pre-rendered cubemaps. A detailed explanation of how to create a cubemap for ice can be found in the Tools/Cubemap Rendering chapter. Ice materials support two types of reflection mapping: spherical reflections and box projection mapping.

  • Spherical mapping – Very fast but inaccurate. Works out of the box.
  • Box projection mapping – Very accurate in box-shaped rooms. Requires additional room size information in the materials.
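
The idea behind box projection mapping can be sketched in plain Python (the package implements this in its materials; the function and parameter names below are illustrative, not taken from the package). The reflection ray is intersected with the room's bounding box, and the cubemap is sampled toward the hit point instead of along the raw reflection vector:

```python
def box_project(pos, refl, box_min, box_max, capture_pos):
    """Return the cubemap lookup direction for box projection mapping.

    pos: shaded point, refl: normalized reflection vector,
    box_min/box_max: room AABB, capture_pos: where the cubemap was captured.
    """
    # Distance along refl to each AABB face; keep the nearest forward hit.
    t = float("inf")
    for axis in range(3):
        if refl[axis] > 1e-6:
            t = min(t, (box_max[axis] - pos[axis]) / refl[axis])
        elif refl[axis] < -1e-6:
            t = min(t, (box_min[axis] - pos[axis]) / refl[axis])
    hit = [pos[i] + refl[i] * t for i in range(3)]
    # The corrected lookup vector points from the capture position to the hit.
    return [hit[i] - capture_pos[i] for i in range(3)]
```

Spherical mapping corresponds to skipping the intersection entirely and sampling with the raw reflection vector, which is why it is cheaper but only correct for distant surroundings.
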
Box projection mapping
UseBoxProjection – Enables box projection mapping. If disabled, the system uses standard spherical cubemap projection.
ReflectionBoxExtend – Half size of the room cached in the reflection map.
ReflectionBoxPosition – Position of the camera where the reflection was cached.
UseReflectionBoxLocal – Forces the system to use the reflection box capture in the local space of the actor position. Useful when the mesh is attached to the same actor as the reflection capture component.

Other reflection parameters:


ReflectionColor – Color of the reflection; allows adjusting the cubemap to underwater conditions. The alpha channel represents the Fresnel power.
ReflectionTexture – Prebaked cubemap reflection texture. It should be captured in the place where the glass is rendered.
Use Shlick Reflection – Enables physically based Schlick Fresnel calculations. Otherwise a fast, simplified dot(camera, normal) Fresnel is used.
UseReflectionBoxLocal – When true, ReflectionBoxPosition is added to the actor position.
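
For reference, the Schlick Fresnel that the Use Shlick Reflection flag refers to is the standard approximation F = F0 + (1 - F0)(1 - cos θ)^5. A minimal Python sketch of both options (illustrative names, not package code):

```python
def schlick_fresnel(cos_theta, f0):
    """Schlick's approximation: F = F0 + (1 - F0) * (1 - cos_theta)^5.

    cos_theta is dot(view, normal); f0 is the reflectance at normal incidence.
    """
    return f0 + (1.0 - f0) * (1.0 - cos_theta) ** 5

def simple_fresnel(cos_theta):
    """The cheap fallback: the inverted view/normal dot product."""
    return 1.0 - max(cos_theta, 0.0)
```

Both rise toward 1.0 at grazing angles; Schlick's version additionally respects the material's base reflectance at normal incidence.
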


The package supports multiple advanced types of refraction. Each type of refraction is useful for a different scenario.

  1. Built-in UE4 refraction
  2. Screen color refraction
  3. Opaque cubemap refraction

Fast dithered layers

The dithered material is a composition of two meshes, with the top layer implemented as dithered translucency. This combination allows using the slow translucent material only where it is needed, and increases the quality of lighting and reflections at low cost by layering the opaque material over the translucent one.


  • The top mesh layer uses an opaque dithered material with the high-quality reflections and lighting implemented by Epic.
  • The bottom mesh layer uses a translucent material that is covered by the top layer using a dithered transition.

It is good practice to use the bottom translucent layer only where it is needed and to cut away the invisible parts of the mesh, increasing performance by lowering geometry overdraw.

The dithered material can be masked by the vertex color alpha channel, painted in vertex paint mode. This functionality is active only when the UseDitheringVertexAlpha option in the top opaque material is enabled.
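
A dithered distance transition of this kind is conventionally implemented with a screen-door (Bayer) threshold. The sketch below is a plain-Python illustration of the general technique, not the package's shader; all names and the exact ramp are assumptions:

```python
# 4x4 Bayer matrix: the classic screen-door dither pattern.
BAYER_4X4 = [
    [ 0,  8,  2, 10],
    [12,  4, 14,  6],
    [ 3, 11,  1,  9],
    [15,  7, 13,  5],
]

def dither_opaque_visible(px, py, distance, border, smooth):
    """Decide whether the opaque top layer covers this pixel.

    Near the camera the opaque layer fully dissolves (revealing the
    translucent layer), far away it fully covers; in between the Bayer
    threshold produces the dithered transition.
    """
    # 0 near the camera, 1 beyond the border, linear ramp across `smooth`.
    t = (distance - (border - smooth)) / max(smooth, 1e-6)
    coverage = min(max(t, 0.0), 1.0)
    threshold = BAYER_4X4[py % 4][px % 4] / 16.0
    return coverage > threshold
```

Computing `distance` once per vertex instead of per pixel is the trade-off the Use Dithering Per Vertex flag describes: cheaper, but coarse on low-density meshes.
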

Material parameter
Use Dithering Distance – Dither based on the distance from the camera.
Use Dithering Vertex Alpha – Mask the dithering with the vertex alpha channel.
Use Dithering Per Vertex – Calculate the dithering distance per vertex to lower shader complexity. It can cause quality problems on low-density meshes.
Dithering Border – Dithering distance border.
Dithering Smooth – Dithering border smoothing.


Parallax cracks

The parallax cracks rendering algorithm uses the distance to the nearest crack to produce a very efficient, smooth effect in only 6-8 texture reads. Compared to other algorithms it is over six times faster and looks much better in closeups.
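
The package's exact shader is not reproduced here, but the general shape of an SDF-driven depth march can be sketched in Python. Everything below is an assumption-laden illustration: the function, its parameters (loosely mirroring Cracks Depth Iteration and Cracks Depth Step Size), and the falloff constants are all made up for the example:

```python
def parallax_cracks(uv, view_dir, sdf, iterations=6, step_size=0.1,
                    scale=1.0):
    """Accumulate crack intensity by marching the view ray through
    depth layers and sampling the SDF at each offset UV.

    sdf(u, v) returns the distance to the nearest crack line (0 on the
    crack itself). view_dir is the tangent-space view vector.
    """
    intensity = 0.0
    u, v = uv
    for i in range(iterations):
        depth = i * step_size
        # Shift the UV by the view direction, proportionally to depth:
        # this is what makes the cracks appear to sink into the ice.
        pu = u + view_dir[0] * depth * scale
        pv = v + view_dir[1] * depth * scale
        d = sdf(pu, pv)
        # Samples on or near a crack contribute; deeper layers fade out.
        intensity += max(0.0, 1.0 - d * 10.0) * (1.0 - depth)
    return min(intensity / iterations, 1.0)
```

Because each iteration is one texture read, keeping the iteration count near the documented 6-8 range bounds the cost directly.
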

Material parameter
Use Cracks – Whether the cracks effect should be used in the material. The cost of this effect is noticeable (about 30-40 instructions), so enable this flag only when you intend to use the effect.
Cracks SDF Texture – Signed Distance Field texture used to generate the effect. A detailed description of preparing SDF textures can be found in the chapter “Signed Distance Fields rendering”.
Cracks Color – The bottom color of the cracks effect.
Cracks Color Light – The top color of the cracks effect.
Cracks Depth Iteration – The number of iterations used to calculate the effect. The best results are achieved between 5 and 10 iterations. It is worth minimizing this value for optimization.
Cracks Depth Step Size – Step size per iteration.
Cracks Depth Scattering – Depth scattering scale.
Cracks Depth Scale – Depth intensity scale.
Cracks Depth Smooth – Smoothing of the effect. It should be lowered when there are a lot of holes in the cracks.
Cracks Distortion – Distortion of the cracks based on the normal map.
Cracks Width – Width of the effect. It should be between 0.93 and 1.0.
Cracks Height – Height of the effect.
Parallax cracks in action


The Ice Cool package supports a deep layer of animated dust.

Dust parameters
Use Dust – Whether the dust should be used in the material.
Dust Color – Color multiplier for the dust texture.
Dust depth shift – Offset of the dust layer.
Use Dust Layered – Whether to use a second layer of dust.
Dust Layer Between – Interpolation of the layer rendered between the base layer and the mesh surface.
Dust Texture – The texture used as the dust layer.
Dust Texture UV Scale – UV coordinates (X,Y) and scale (Z,W).
Dust Texture UV Anim – UV animation speed per second (X,Y).
Use Dust World Space UV – Whether to calculate dust UVs in world space or take them from UV0.
Use Dust Noise – Whether to distort the dust layer with a noise texture.
Use Dust Noise Alpha – Whether to read the UV distortion of the dust layer from the noise alpha channel.
Dust Noise Texture – Dust noise texture source.
Dust Noise Scale – Dust noise texture UV scale.
Use Dust Noise World Space UV – Whether to calculate dust noise UVs in world space or take them from UV0.


The coating effect extrudes the mesh in a direction created from the normal and the gravity vector. The coating effect is useful for covering geometry with ice, with additional icicles hanging from the mesh.
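
The displacement described above can be sketched in plain Python (the real work happens in the material's vertex offset; the function name is illustrative, while the CoatingOffset packing follows the parameter table below):

```python
def coating_offset(normal, coating_offset_param):
    """World-space vertex displacement for the coating effect.

    coating_offset_param: (gx, gy, gz, a) – the CoatingOffset parameter.
    RGB is the gravity direction whose length encodes its force; A is the
    extrusion size along the vertex normal.
    """
    gx, gy, gz, a = coating_offset_param
    # Extrude along the normal, then pull the result toward gravity,
    # which is what makes icicles hang downward.
    return (normal[0] * a + gx,
            normal[1] * a + gy,
            normal[2] * a + gz)
```

Downward-facing vertices get the largest net displacement, since the normal and gravity terms point the same way there.
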

Coating parameters
Use Coating – Whether the coating effect should be enabled.
Coating Offset – The RGB channels represent the direction of gravity, and the length of this vector represents its force. The A channel is the coating size in the direction of the normal.
Coating Normal Distortion – Noisy distortion in the normal direction.
UseCoatingNoise – Whether to use a noise function to calculate random offsets.
UseCoatingNoiseSkin – When true, noise is calculated using the skin vertex position; when false, the system uses the texture UV.
CoatingNoiseScale – Noise irregularity.
UseCoatingTexture – Whether to use the coating texture.
CoatingTextureUVScale – Coating texture UV scale modifier.
CoatingTexture – The texture used for the coating effect.
Coating material in action.

Animated coating

Since update 1.1, Ice Cool supports an animated coating effect that can be useful when freezing meshes.

There are multiple types of coating growing animation:

  1. SPHERE – UseCoatingSphereAnimation = true. The coating grows inside the sphere described by CoatingSphere (rgb – location, a – radius).
  2. PLANE – UseCoatingPlaneAnimation = true. The coating grows on one side of the plane described by CoatingPlane (rgb – normal, a – plane offset).
  3. FLAT – time-based, evenly growing. Growth is controlled by the CoatingFlatBlend float (0 – no coating, 1 – full coating).
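
The sphere and plane modes boil down to a signed-distance blend per vertex. The plain-Python sketch below illustrates that idea; the helper names are invented, and the soft edge is assumed to correspond to CoatingShapeDistanceBlend:

```python
import math

def sphere_blend(pos, sphere, smooth):
    """Coating amount for SPHERE mode: 1 inside the sphere, fading to 0
    across `smooth` units outside the radius."""
    cx, cy, cz, radius = sphere  # CoatingSphere: rgb – location, a – radius
    dist = math.dist(pos, (cx, cy, cz))
    t = (radius - dist) / max(smooth, 1e-6)
    return min(max(t, 0.0), 1.0)

def plane_blend(pos, plane, smooth):
    """Coating amount for PLANE mode: signed distance to the plane,
    blended the same way."""
    nx, ny, nz, offset = plane  # CoatingPlane: rgb – normal, a – offset
    signed = pos[0] * nx + pos[1] * ny + pos[2] * nz - offset
    t = -signed / max(smooth, 1e-6)
    return min(max(t, 0.0), 1.0)
```

Animating the radius (or plane offset) from a blueprint then sweeps the frozen region across the mesh.
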
Coating Animation
UseCoatingAnimation – Whether the coating animation effect should be used.
CoatingAnimationTime – Flat coating animation blend between no coating and full coating. Should be modified externally by blueprints.
CoatingAnimTranslucent – Speed of revealing the coating effect in the translucency channel.
CoatingAnimType – Displacement mode type: 0 – first type, 1 – second type (check the difference in how the icicles and shell appear).
CoatingShapeDistanceBlend – Soft blend between no coating and full coating, used in the plane and sphere modes.

UseCoatingPlaneAnimation – The coating grows on one side of the plane described by CoatingPlane.
CoatingPlane – CoatingPlane.rgb – normal, CoatingPlane.a – plane offset. Should be modified externally by blueprints.
UseCoatingSphereAnimation – The coating grows inside the sphere described by CoatingSphere.
CoatingSphere – CoatingSphere.rgb – location, CoatingSphere.a – radius. Should be modified externally by blueprints.
UseCoatingShapeLocalPosition – Use the local-space vertex position during sphere/plane calculations.
UseCoatingDebugAnimation – Show example debug animations. Disable when you want to animate manually in blueprints using material parameters.


The Ice Cool package contains additional tools that can help you use the full potential of the material system.

Cubemap reflections rendering

Materials use cubemaps to simulate reflection and refraction in opaque materials. Those cubemaps can be created using the BP_CaptureCubeMap blueprint.

  1. Place Blueprints/BP_SceneCaptureCube on your map and position it properly to capture the scene (the center of the room works best).
  2. Open the IceCool/Tools folder and select the render target (RT_SceneCapture).
  3. Right-click RT_SceneCapture and select “Create Static Texture”. The newly created texture is ready to use.
  4. Set the cubemap as the material parameter called ReflectionTexture.

Signed Distance Fields rendering

A Signed Distance Field is an image in which each pixel contains the distance to the nearest point on the boundary. The sign of the distance additionally determines whether the pixel is inside or outside the rendered shape. An SDF image, which looks like a gradient, can be loaded from a file or generated by mathematical functions called Signed Distance Functions.

A short description of how to create the SDF texture:

  1. Place the BP_SignedDistanceFiled on the map.
  2. Set up the parameters:
    Source Texture – The texture that will be used for generating the SDF.
    Search Width – Width of the distance search in pixels. W*W is the number of iterations per pixel; large values can crash the program.
    Texture Size – Resolution of the texture.
  3. Click Generate to update the render target.
  4. Right-click the render target and select “Create Static Texture”. The newly created texture is ready to use.
  5. Use the texture as the CracksSDFTexture parameter.
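
The brute-force search implied by the Search Width parameter (W*W candidate pixels per output pixel) can be sketched in Python. This is an illustration of the technique, not the tool's implementation; the function name and the boolean-mask input format are assumptions:

```python
import math

def generate_sdf(mask, search_width):
    """Brute-force SDF: for each pixel, scan a search_width window for
    the nearest pixel on the opposite side of the shape boundary.

    mask: 2D list of booleans (True = inside the shape).
    Returns distances: negative inside, positive outside, clamped to
    search_width when no boundary is found inside the window.
    """
    h, w = len(mask), len(mask[0])
    out = [[0.0] * w for _ in range(h)]
    r = search_width
    for y in range(h):
        for x in range(w):
            inside = mask[y][x]
            best = float(r)
            # W*W candidate pixels per output pixel, hence the warning
            # that large search widths can hang or crash the tool.
            for dy in range(-r, r + 1):
                for dx in range(-r, r + 1):
                    ny, nx = y + dy, x + dx
                    if 0 <= ny < h and 0 <= nx < w and mask[ny][nx] != inside:
                        best = min(best, math.hypot(dx, dy))
            out[y][x] = -best if inside else best
    return out
```

The resulting gradient image is exactly what the parallax cracks material consumes through the CracksSDFTexture parameter.
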