Imaginary Blend Refund Policy – Terms and Conditions
Our refund policy is designed to ensure fairness for all customers and creators in the marketplace. Refunds are typically granted for faulty products or those that do not function as intended.
Due to a rising number of fraudulent refund demands made immediately after purchase without any particular reason, I find it necessary to implement a refund policy that clearly outlines the guidelines.
Refunds are applicable under the following conditions:
A bug within the Marketplace item prevents you from using it as advertised.
The description, screenshots, or video in the Marketplace don’t accurately reflect the content of the Marketplace item.
By making a purchase, you acknowledge the product limitations, especially the following:
The product’s documentation outlines its constraints, and it is essential to review them before making a purchase.
Product demos showcase typical use cases for which they were created and thoroughly tested.
If uncertain about the product’s capabilities, inquire and ensure that it meets your expectations.
Make sure that the product supports the Unreal Engine version that you use.
The purchase pertains to the current version of the product, not future planned changes. If a required functionality is not present in the current version, consider holding off on the purchase.
Buying the asset does not entitle you to custom development work on your project from the creator.
The refund will not be approved in the following cases:
“I thought it would work differently”
Read the documentation and listed limitations before purchasing. In case of ambiguity, inquire on the product page to ensure your expectations can be met.
“I wanted it to be more realistic”
The creator provides an executable demo that presents the capabilities of the product. Test it before purchasing.
“I didn’t see that it won’t work on Mobile and VR, PathTracing, Ray Tracing, Lumen, Sequencer”
Check the description for this information, and ask explicitly about specific technologies or design assumptions.
“I’m a beginner; I don’t know how to use the engine; nothing works; I need tutorials”
The product is intended for experienced users of the Unreal Engine. Support questions will be answered, but the creator is not obligated to teach how the engine works.
“I wanted to create an open-world multiplayer, but it’s impossible.”
Creating open-world games is challenging due to the extensive design and development required to craft a seamless, immersive environment. The requirements cannot be precisely defined in advance. By choosing this genre, you implicitly accept the difficulties you will encounter.
“The product was expensive, and I need money now”
Please do not act impulsively; make wise decisions before purchasing the product.
“I did not read the refund policy before the purchase”
Ignorance of the policy is not considered an excuse.
Only Epic can issue refunds for Marketplace items.
Before requesting a refund, contact support through the email address email@example.com or the “Questions” section on the marketplace to get help.
The seller has the right to request verification of your refund history to determine whether the request is fraudulent.
If the marketplace support team leaves the refund decision to the seller, the “Imaginary Blend Refund Policy” applies.
In case of issues with downloading the product
Ensure the product supports the engine version for which you are trying to download it.
Restart the Marketplace launcher to refresh products on your list.
Contact marketplace support to resolve the issue.
Fluid Flux continues to develop and bring more useful tools to users. The 2.0 release delivers further improvements, making the wide feature set more robust, workflow-friendly, and versatile while keeping the needs of different sectors in mind.
The Fluid Flux is a powerful water system based on 2D shallow-water fluid simulations.
Realtime shallow water simulation – fluid data modifiers, wave generator, and extendable interface
Fluid surface rendering – caustics, wetness, underwater, waterline, advected foam, advected waves, blending with the ocean, dynamic audio detection
Fluid Interaction – a simple, cheap ripple solver that moves with the character, optimized to an absolute minimum
Ocean wave blending – rendering a tileable ocean heightmap texture in a single pass
Clean, efficient, GPU-friendly implementation, interface designed with the KISS (Keep It Simple, Stupid) rule in mind
Tool for generating ultra-fast static meshes with flow maps baked into vertex color.
Advanced fluid state management, loading state in gameplay.
Niagara fluid async readback system for sampling height and flow of fluid in blueprints.
Dynamic audio analyzer. The sound source is positioned based on fluid movement.
Four example maps – beach, island, river, and baked static river
Velocity-based fluid flow advection method for foam caustics and waves
With great power comes great responsibility. I am committed to being a reliable marketplace creator, so it is important for me to clearly communicate the limitations and disadvantages of using simulations. The Fluid Flux system is somewhat overhyped, so please read the description below and ensure that everything aligns with your expectations and requirements before making a purchase. This is crucial for the Refund Policy:
Overall, I cannot address every conceivable water issue in every type of project. While Fluid Flux represents a significant advancement for the game industry, it also introduces additional challenges. Controlling and adjusting simulations can be demanding; sometimes, a single parameter change can alter the fluid flow across an entire map.
The Fluid Flux simulation is based on a Shallow Water Equations (SWE) solver; the algorithm was published by Matthias Müller in “Real-time Simulation of Large Bodies of Water with Small Scale Details”. The simulation is calculated on a heightfield mesh, which means all obstacles are rendered to a heightmap using a top-down projection. Fluid cannot be simulated in caves; however, it is possible to make the algorithm ignore certain objects, such as bridges, so they do not interfere with the simulation.
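To give a feel for how a heightfield SWE solver works, here is a minimal 1D sketch in plain Python: water depth lives on cells, velocity on cell faces, and each step accelerates faces by the surface-height gradient and then moves depth along those velocities. This is an illustration of the general technique only, not the pack's GPU implementation, and all names and constants here are invented for the example.

```python
# Minimal 1D shallow-water heightfield step (illustrative sketch only;
# the actual Fluid Flux GPU solver follows Muller's paper and differs in detail).

GRAVITY = 9.81

def swe_step(height, ground, velocity, dx, dt):
    """One explicit step of a 1D shallow-water heightfield update.
    height:   water depth per cell
    ground:   terrain height per cell (the "ground map")
    velocity: horizontal velocity per cell face (len(height) - 1 entries)
    """
    n = len(height)
    surface = [ground[i] + height[i] for i in range(n)]
    # Accelerate face velocities by the water-surface gradient.
    new_v = [velocity[i] - GRAVITY * dt * (surface[i + 1] - surface[i]) / dx
             for i in range(n - 1)]
    # Move depth between neighboring cells with upwinded fluxes.
    flux = []
    for i, v in enumerate(new_v):
        upwind = height[i] if v > 0 else height[i + 1]
        flux.append(v * upwind)
    new_h = list(height)
    for i, f in enumerate(flux):
        new_h[i] -= dt * f / dx
        new_h[i + 1] += dt * f / dx
    # Clamp so depth never goes negative on steep ground.
    new_h = [max(0.0, h) for h in new_h]
    return new_h, new_v
```

A bump of water in the middle of a flat channel spreads outward while the total volume stays constant, which is exactly the behavior that makes heightfield solvers attractive for rivers on terrain.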
Scalability is a real problem. The simulation requires allocating floating-point render targets, so the maximum recommended texture resolution is 1024×1024. That means if 1 pixel represents 100 cm in the real world (very low quality), your simulation area can cover approximately a 1 km × 1 km square with low-detail water. It's not that much, but still useful. A simulation frame can be baked to a static mesh and placed on the level.
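The scaling arithmetic above is easy to check. The helper below assumes a square RGBA 32-bit-float render target purely as an example; the pack's actual texture formats and channel layouts are not documented here.

```python
# Back-of-the-envelope check of the scaling limits described above.
# RGBA32F format is an assumption for illustration, not a documented internal.

def coverage_m(resolution_px, cm_per_pixel):
    """Side length of the simulated area, in meters."""
    return resolution_px * cm_per_pixel / 100.0

def render_target_mib(resolution_px, channels=4, bytes_per_channel=4):
    """Memory of one square floating-point render target, in MiB."""
    return resolution_px ** 2 * channels * bytes_per_channel / (1024 ** 2)

side = coverage_m(1024, 100)     # 1024 px at 100 cm/px -> ~1 km per side
mem = render_target_mib(1024)    # one RGBA32F 1024x1024 target -> 16 MiB
```

Doubling the resolution quadruples both the pixel count and the memory, which is why pushing past 1024×1024 gets expensive quickly.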
Efficiency may be a problem. I recommend avoiding resolutions larger than 1024×1024 in games. A large-scale simulation requires a substantial number of computations, so I suggest using the lowest feasible resolution. In the demos I provide, the maximum resolution employed is 1024×256 (BeachMap).
Multiplayer is not supported for dynamic simulation because synchronization of render targets is limited. However, the system can be used in a multiplayer game as long as your gameplay is not dependent on fluid simulation results.
Statically generated LOD meshes can be inconsistent with the waterline post-process; this issue will be addressed in the future.
The Water Plugin is not supported in the current version. It is possible to use my water material with the Water Plugin mesh, but this is not officially supported.
Niagara fluid readback, used for buoyancy and fluid detection, returns results with (at least) one frame of delay. It's good enough for most features, like swimming and fluid detection, but not 100% reliable. Trace hits with dynamic fluids are not currently supported.
My support time is very limited because I am also working on updates that will improve quality and efficiency.
The system is enormously advanced and complicated so it may be overwhelming for beginners.
The documentation may be incomplete in some areas. I was pushed to release as soon as possible and could not find enough time to prepare many simple examples. The Fluid Flux system is clean and elegant, but the ability to read Blueprints and analyze the example code is required to use it.
The simulation area can’t be rotated. The product supports only axis-aligned rectangular volume.
The simulation area can’t be moved in runtime. Movable volume is one of the most important features for future updates.
The simulation does not support the wave break effect because this approximation of fluid does not contain the needed data to render it.
Underwater glass, holes, and the submarine view are not supported yet.
The weakest point of this product is its documentation. I may not excel at explaining things, but in case of any issues, I will make an effort to resolve them for you. Please don’t hesitate to ask.
Good and bad practices
Try to avoid using huge simulation resolutions because you will run out of memory very fast; 1024×1024 seems like a good compromise for now.
Don’t make your gameplay rely on fluid simulation in multiplayer because it can’t be synchronized.
Avoid using Fluid Flux on flat surfaces; subtle slopes are always better.
Avoid hard-edged geometry (boxes); it can be approximated incorrectly in height maps, and large slopes sometimes look terrible.
Try not to overestimate this system. If I haven’t prepared the ocean with ships and a huge island covered by rivers and simulated lakes, there is probably a reason for that.
Try not to update the simulation ground every frame; it may cost you a lot of performance.
Meshes using the materials with “PixelDepthOffset” active may not render properly into a ground height map. There is a simple workaround described in the chapter “Ground Capture->Solving issues”.
Questions & Answers
Unreal Engine 5
Fluid Flux was originally developed and tested on Unreal Engine 4.26. While it can be used with the latest version of Unreal Engine 5, please note that Unreal Engine 5 is still in an early stage of development. New features such as Lumen, in particular, may not be supported properly.
As a result, I cannot provide support for engine-related bugs or address differences between UE4 and UE5. To ensure you are aware of the most common issues that may arise, please refer to the ‘Known Issues’ chapter.
Why is the price so high? Can you offer any promotions or discounts?
Creating Fluid Flux required over 20 months of research and development. Typically, a system of this nature would cost a company at least $200,000, which makes the current price a significant time- and cost-saving solution for everyone.
The price of the Fluid Flux pack includes a subscription for all future updates and my support. This kind of system requires a lot of support, especially with ongoing engine updates. Since I have other products on my marketplace profile, my time is currently limited. I already quit my regular job to fully focus on developing and improving products, and I may need to hire someone to help me manage everything.
The asset combines several significant features that have been designed to seamlessly work together, making your development process easier. You no longer need to purchase multiple separate products and spend time integrating them. It includes a ripple solver simulation, shallow water fluid simulation, water surface rendering, ocean wave generator, buoyancy support, waterline effects, post-processing capabilities, coastline rendering, and even support for Niagara particles. I have taken care of all these aspects so that you don’t have to worry about them.
Please check my marketplace profile to confirm that I am a trustworthy creator with hundreds of positive reviews. The price of my products is always determined by the actual work and time required to develop them. Eventually, you will come to realize that time is a genuine currency with significant value in this world.
There will be a promotional sale, but first I need to feel that I am ready to support more customers. If you are not in a rush, just add my product to the wishlist and please be patient. If you want to be notified a few days early about a planned promotional sale, join my Discord server.
I'm not forcing anyone to buy my product. There are alternatives like Niagara Fluids and the Water Plugin; you can try them for free and compare the results with my demo. Maybe you don't even need Fluid Flux. Choose wisely.
Can I use it in my open-world game?
A true open-world setup has not been tested yet. The demo examples show how the system can be used. I can't promise anything more at this point of development. Just assume that you get what you see in my videos, no more, no less.
Does it support multiplayer/replication?
Unfortunately, multiplayer is not supported. The Fluid Flux can be used in multiplayer games only as a visual addition and should not affect gameplay. There are many reasons:
Technical limitations. Fluid Flux uses Niagara readback for sampling wave data (it's the only way to read from the GPU), and Niagara does not run on the server.
Bandwidth limitations. There is too much dynamically changing data for the simulation to be synchronized over the internet. Without correcting the differences between server and client, everything would break very quickly.
Human limitations. I am not a multiplayer programmer so it’s even harder for me to jump into it and find reasonable workarounds.
External pack integrations (ALS/UDS/Fluid Ninja)?
I am planning to work on some integrations in the future. If you have a specific pack in mind, let me know.
Can I use it with Voxel Plugin?
Yes, you can use Fluid Flux with the Voxel Plugin, but it is also limited to heightfield projection, so caves and planets are not supported.
Use Simulation.RuntimeCaptureDelay = 1.0 to force the simulation to wait for the Voxel Plugin to generate its mesh before rendering it to the height map.
The heightmap in this presentation is updated every frame using CaptureGroundHeightmap. This is not very efficient and should probably be optimized to update only the area around the brush when something changes on the map. The UpdateGroundMap(Position, Size) function would be a better choice in this case.
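The gain from updating only the brush region instead of recapturing the whole heightmap is easy to quantify. The numbers below are a plain-Python illustration of that cost difference; the brush radius is an invented example value, not a recommendation from the pack.

```python
# Cost comparison: recapturing the full ground heightmap every frame
# vs. updating only a square region around the brush (illustrative only).

def cells_updated_full(resolution_px):
    """Cells touched when the whole heightmap is recaptured."""
    return resolution_px * resolution_px

def cells_updated_partial(brush_radius_px):
    """Cells touched when only a square region bounding the brush updates."""
    side = 2 * brush_radius_px
    return side * side

full = cells_updated_full(1024)        # 1,048,576 cells per frame
partial = cells_updated_partial(32)    # 4,096 cells per frame
speedup = full / partial               # 256x fewer cells touched
```

For a 1024×1024 domain with a 32-pixel brush, the partial update touches 256 times fewer cells per frame, which is why UpdateGroundMap(Position, Size) is the better choice when only a small area changes.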
What about VR and mobile?
VR devices and mobile are currently not supported.
High-precision dynamic fluid simulations can't be calculated on mobile, so only baked meshes/states can be used on that platform. I am planning to add VR and mobile support in future updates. The pack will receive a cheap surface material mode similar to the solution presented in Aquatic Surface. One user notified me that my demo maps work without issues on Oculus Quest 2, but I have not tested this myself so I can't confirm it.
How to add swimming to ALS? Can you show me?
This task is on your side. The Fluid Flux (BP_FluxDataComponent) can give you all the data you need to implement swimming. I've also prepared an example swimming implementation (BP_FluxSwimmingComponent) for testing.
Should I switch to Fluid Flux? Is it better than WaterPlugin/Oceanology or Aquatic Surface?
It depends on your specific requirements. Fluid Flux excels at creating dynamic river simulations on heightfields and interactive scenes, but it doesn’t scale very well. If you only need a background ocean without detailed simulations, then you might not want to go through the trouble of using simulations at all. However, Fluid Flux 2.0 introduces a new coastline domain system that can handle landscapes of up to 10km x 10km and provides an infinite ocean. I recommend playing the demo of Fluid Flux to get a better sense of whether it aligns with what you’re looking for.
Why blueprints? Is Fluid Flux slow because of that?
This system is fully implemented in Blueprints but relies mainly on the GPU (shaders/render targets). Yes, it might be faster in C++, but that would also stretch out the development time. The cost of the Blueprint code will be reduced by Niagara implementations in the future, so it will work even better.
Is Fluid Flux calculated deterministically?
If you simulate on the same machine, it is deterministic (a constant delta time is used), but there are likely differences between floating-point operations on different GPUs, so it can't be fully deterministic across hardware. For multiplayer games, the biggest problems are synchronizing the state when someone joins the game, and handling a player who loses frames because of spikes that the simulation can't work off; I've decided to limit the number of accumulated iterations per frame in that case.
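The fixed-delta-time stepping with a capped catch-up described above is a common pattern, and it can be sketched in a few lines. The step rate and the cap below are invented example values; the pack's actual limits are not documented here.

```python
# Fixed-delta-time stepping with a cap on accumulated iterations, in the
# spirit of the behavior described above (sketch only, not the pack's code).

FIXED_DT = 1.0 / 60.0       # assumed simulation step
MAX_STEPS_PER_FRAME = 4     # assumed cap on catch-up iterations

def advance(accumulator, frame_dt):
    """Return (steps_to_run, new_accumulator) for one rendered frame."""
    accumulator += frame_dt
    steps = int(accumulator // FIXED_DT)
    if steps > MAX_STEPS_PER_FRAME:
        # A frame spike: drop the simulation time we can't afford to work
        # off. This is exactly where two machines' results start to diverge.
        steps = MAX_STEPS_PER_FRAME
        accumulator = 0.0
    else:
        accumulator -= steps * FIXED_DT
    return steps, accumulator
```

A normal frame runs one fixed step; a half-second spike would demand thirty catch-up steps, so the cap clamps it and the discarded time is simply lost, which keeps the frame rate stable at the cost of strict determinism.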
Do I need this pack? Other water systems like your Aquatic Surface can do the same.
Well, if you can't see the difference, then you probably don't need Fluid Flux. I've prepared a simple comparison of Fluid Flux and Aquatic Surface that visualizes the difference:
That does not mean Aquatic Surface is bad. It’s a great product with a completely different feature list. You have to choose a product that fits your requirements.
Is it efficient?
It depends on the GPU, resolution, etc. You can download the demo and try it yourself; there is an FPS counter on the screen.
GTX 860M in an 8-year-old Lenovo Y50 notebook
What about the future of this project?
I have plans for numerous updates but I don’t have a specific timeframe as of now. It’s crucial to make your purchase decisions based on the product demonstrated in the demo, rather than relying on promises. You can keep track of the current progress of improvements at Trello.
Scalability – Large-scale simulations locally updated with the moving areas are in my plans.
Niagara fluids – The system will be rewritten to Niagara fluids once it is stable and ready for the transition.
Audio detection – the current system uses an ambient audio source and a simple dynamic source; I am planning to improve this to three dynamic audio sources working at different distances.
Simulation volume – Moveable simulation area and the ability to work with multiple simulations simultaneously.
Alternative material modes
Meshes with dithering enabled can not be cached by ground capture. There is a workaround for this problem presented in the M_Photoscan_Master material. More info in chapters related to ground capture.
It seems that construction scripts work differently when a level loads during editor startup in UE5.1; render targets are not initialized properly because of that. If you see that something is not loaded, just reopen the map or restart the simulation and everything will work fine again. Sorry for that.
Plants will not load properly on the map if you are not in “Real-Time” render mode in the editor. It's a limitation of Blueprints; I don't have any event that executes after loading to adjust the Niagara effects.
“Real-Time” render mode is required to simulate in the editor. Otherwise, the view will not refresh after each frame.
If you downloaded Fluid Flux 2.0 for UE5.1 and later updated the engine to 5.2, it is essential to update Fluid Flux as well. This is because certain engine features have been deprecated between those versions.
This section covers the fundamentals of Fluid Flux and its tools. If you are new to Unreal Engine, you should become familiar with the Unreal Editor interface, Blueprint visual scripting, and the types of content. Working according to the documentation can provide the best possible experience with this product.
The effective use of the Fluid Flux pack requires activating specific built-in engine plugins and integrating TPP input configuration for testing the example character implementation from the demo. As a marketplace creator, I cannot incorporate these changes into the asset pack, so developers need to implement them in their projects.
I recommend using the template example project for your initial trials with Fluid Flux. The easiest way to run the demo is described below:
Open and run the FluidFlux.uproject file (it will automatically register the FluidFlux project in the launcher projects list). The template project is prepared for 4.26 but can be converted to any higher version including UE5.
Download the Fluid Flux asset pack to the template project using the marketplace launcher.
Working with an external project is a bit more challenging because every project may use a specific character configuration and a different engine version or feature set, so everything has to be configured by hand.
Below you can find a list of plugins that are required by Fluid Flux:
Editor Scripting Utilities
Procedural Mesh Component
The first place to visit in the demo folder is Demo/Maps. This folder contains all the example maps, which work with the TPP example character (BP_DemoCharacter). If your project uses another character template, the input configuration will probably need to be changed to make it work properly.
You can also set it up by hand in project settings or download it from this link: DefaultInput and copy it into your Configs/DefaultInput.ini file.
In the final stage of configuring your project make sure that:
DBuffer is enabled in your project settings because it’s required for proper decal (wetness and caustics) rendering. If you don’t want to use those effects then you can disable decal material in the surface actor.
CustomDepth-Stencil Pass is enabled in your project settings. This feature is required for proper underwater masking.
Strata/Substrate materials should be disabled in your project. This feature is still experimental and not supported by Fluid Flux.
Real-time rendering is active in your viewport; it is required for simulating fluid in the editor.
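The checklist above maps to a few standard Unreal console variables in DefaultEngine.ini. The snippet below is an assumption-laden convenience, not part of the pack; the cvar names are the engine's usual ones, but verify them against your engine version before relying on them:

```ini
[/Script/Engine.RendererSettings]
; DBuffer decals, required for the wetness and caustics decals
r.DBuffer=True
; Custom Depth-Stencil Pass, required for underwater masking
r.CustomDepth=3
; Substrate (formerly Strata) must stay disabled; not supported by Fluid Flux
r.Substrate=False
```

Editing these through Project Settings > Rendering produces the same entries and is the safer route if you are unsure.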
The Fluid Flux project is organized in a certain way; this short description will show you what kind of content can be found in each folder.
Demo – The demo is the most important folder for new users. The demo examples present how to use this pack, how effects can be achieved, and how to integrate systems with characters. Everything that can’t be found in this documentation probably will be presented in a very clean way in the demo folder.
Simulation – Shallow water simulation actor and tools for controlling fluids and generating state.
Coastline – Generating coastline data for oceans.
Interaction – A simple system of interactions that adds detailed lightweight ripple fluid simulations.
Surface – Renders surface, underwater volume, post-process, caustics, and playing audio.
Environment – Niagara particle systems that read back information from the simulation, implement swimming and buoyancy, and drive particles using the fluid state.
Waves – a system designed for generating ocean waves that can be used in the background and mixed with the fluid simulation.

Before starting work with the pack, it's worth becoming familiar with its structure and the systems it provides.
The latest version of the product is detailed on the marketplace page and at the beginning of the documentation. Frequent updates for Fluid Flux are planned, so always ensure that the Fluid Flux package version is up to date. However, be mindful of potential issues when updating.
Changes that can be made during updates:
adding features and examples
improving quality and exposing parameters
I always try to minimize update damage; however, this is not an easy task when the marketplace update system supports only adding or replacing files. There are a few basic rules worth noting before updating:
It's better to make a backup first: copy your version of the pack before updating.
Remove the pack from the project before updating it. This way you will get a clean, fresh version of the pack without any ghost files.
Do not modify the pack on your own. If you need modifications then you can inherit classes/materials and override functions. Store child classes and custom material instances outside the Fluid Flux folder.
If you need some small, trivial modification that could improve usability, let me know; we will figure out a solution and maybe put it into the next update.
The code that doesn't exist is the code you don't need to debug. I try my best to solve all possible bugs or find good workarounds, but there is always something that can break; that's the typical lifecycle of applications nowadays.
Simulations can be unpredictable, and some edge cases are probably still not handled. If you find a specific bug, you can report it to me and I will take a look at it within a few days. Before sending the report:
make sure that you use the newest version of my pack.
prepare a detailed explanation of the repro steps needed to recreate your bug, and make sure that the explanation is as clear as possible.
create a minimal example project that can show me the bug (it will increase the chance of fixing it)
attach information about the engine version you use, and your development platform.
prepare a video or screenshots presenting the problem and reproduction steps.
Ground capture is a fundamental aspect of the Fluid Flux systems, as the fluid requires a designated vessel. Water blockers are recognized by rendering the scene into a height map using a top-down projection. A height map, also referred to as a ground map, is a specialized texture that stores the height of the geometry in its pixels.
In the Fluid Flux pack, this task is handled by the FluxHeightmapComponent, which is used in both the simulation domain and the coastline domain. The generated height data plays a pivotal role in determining the appearance and movement of water on slopes.
The ground scene capture component renders the current scene into a heightmap texture. Sometimes, we need to exclude certain actors, such as bridges, that should not act as fluid blockers. In the ‘Domain:Heightmap: CaptureVisibility‘ tab, you can find options that may help determine which meshes should be visible in the heightmap:
VisibilityMode can be used to determine whether the capture actor should render the entire scene and exclude the specified actors (HideAllListed) or render only a list of actors that should be used as blockers (ShowOnlyListed).
ActorsReferences – excludes chosen actor objects in your scene
Actors of Class – can be used to exclude actors by class, including child classes
ActorsWithTag = “FluxHide” – every actor with this tag will be added to the list of excluded objects.
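The three listing mechanisms above can be thought of as one combined filter that the capture applies per actor. The sketch below is a plain-Python illustration of that logic; the actor representation, field names, and the way the modes combine with the tag list are invented for the example and do not mirror the Blueprint internals.

```python
# Sketch of capture-visibility filtering. Actor stand-ins are plain dicts;
# all names here are invented for illustration.

def visible_in_heightmap(actor, mode, listed_names, listed_classes,
                         tag="FluxHide"):
    """Decide whether an actor is rendered into the ground heightmap.
    mode: 'HideAllListed' renders everything except the listed actors;
          'ShowOnlyListed' renders only the listed actors.
    """
    is_listed = (actor["name"] in listed_names
                 or actor["cls"] in listed_classes
                 or tag in actor["tags"])
    if mode == "ShowOnlyListed":
        return is_listed
    return not is_listed  # HideAllListed

bridge = {"name": "Bridge_01", "cls": "StaticMeshActor", "tags": ["FluxHide"]}
rock = {"name": "Rock_07", "cls": "StaticMeshActor", "tags": []}
```

With HideAllListed, the tagged bridge is skipped (so water flows under it) while the rock still blocks the fluid; switching to ShowOnlyListed inverts the behavior.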
Use the M_PhantomMesh material to render a mesh ONLY during ground processing (invisible in the world). This is good for imitating the soft slopes of waterfalls, or for creating an alternative version of the ground when the environment is more complicated (like indoor meshes).
If some objects should not be rendered to the heightmap, you can exclude them; and if something is not rendering, you can investigate what is going on without simulating the fluid.
In case of any problems, bugs, or unexpected behavior, the debug preview is the most important feature. It shows the height map that will be used for blocking the water on the scene.
Note that not every detail will be captured in the ground map; the accuracy depends on the simulation resolution scale. The shader used for rendering this preview colors lines based on slope and exposes discontinuities in the ground.
Green lines – easy to simulate, very stable fluid
Blue lines – a good slope for moving fluid in some direction
Red lines – can cause some instability if fluid flows over it, but good for walls
White lines – there may be a cave that can expose holes in the fluid mesh
Fully red polygons represent holes and discontinuities in the mesh. These should be eliminated if fluid can flow into the area.
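The color coding above amounts to thresholding the local slope angle. The sketch below shows the idea in plain Python; the angle cut-offs are invented for illustration, as the preview shader's actual thresholds are not documented here.

```python
# Slope classification in the spirit of the debug preview's color coding.
# The angle thresholds are invented for illustration.
import math

def classify_slope(dh, dx):
    """Map a local height difference over distance dx to a preview color."""
    angle = math.degrees(math.atan2(abs(dh), dx))
    if angle < 10:
        return "green"   # easy to simulate, very stable fluid
    if angle < 35:
        return "blue"    # good slope for directing flow
    if angle < 80:
        return "red"     # may destabilize flowing fluid; fine for walls
    return "white"       # near-vertical: possible cave/hole in the mesh
```

Reading the preview this way makes it clear why hard-edged boxes show up red or white: a single-pixel step in the heightmap is a near-vertical slope.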
Landscape or mesh is not rendering to the ground map
If the “PixelDepthOffset” feature is used in a material, the mesh may not be rendered to the ground map, which can cause simulation problems. This can be easily fixed with a simple modification to your material: use the MF_FluxPixelDepthOffset material node, which automatically disables “PixelDepthOffset” while the ground map is being cached.
The M_Photoscan_Master material is an example of a workaround for this problem with the use of the MF_FluxPixelDepthOffset node.
The BP_FluxSurface is an abstract actor responsible for the general implementation of an audiovisual representation of the simulation and coastline domain data.
An abstract actor is like a guide for creating new actors. It defines some common features that all related classes should have, but it’s not complete on its own. Other classes need to finish the details. For example, an abstract class may not specify any ambient water audio, but a customized ocean template has sounds that match the type of water it represents.
The surface actor implements a list of advanced subsystems:
Surface wetness decal
Surface detail advection
Generating a procedural static mesh
Analyzing the fluid surface for the audio source
Volume absorption and volume scattering
Rendering underwater fog and the absorption volume
The BP_FluxSurface is an abstract base class, which means it can't be placed directly on a level. Configuring all these subsystems can be time-consuming, particularly for new users. This is why the package includes preconfigured surface child actor templates, which can be found in the FluidFlux/Surface/Templates folder.
Default water surface, spawned dynamically when the domain.SurfaceActorReference is not specified.
A simple river/waterfall material, a basic underwater post-process, and river audio
Advanced ocean materials, ocean audio, materials blended with wave actor, underwater mesh, underwater scattering, and absorption volume.
Renders the two domains, simulation and coastline, at the same time. Useful for large oceans and islands.
Users can also implement their own surface templates and extend the functionality of the surface.
The BP_FluxInteractionCapture is a system designed for adding efficient detailed interaction simulated in small areas around the camera (or specified object).
It is a perfect addition that can improve the quality of baked simulation almost for free.
It currently supports the simplest fast ripple solver (there are plans for a fluid pressure solver in the future).
A demonstration of the interaction system configuration is highlighted in the BP_DemoCharacter. This blueprint illustrates the application of components for enabling character interactions with water. A detailed explanation of the implementation can be found below:
BP_FluxDataComponent – This component reads the fluid data like height and velocity that are needed during further calculations of interaction. You can find more info about this component and configuration in the section Async readback.
BP_FluxInteractionComponent – This component stores a list of interaction sources. The interaction source is a sphere attached to a component (or skeletal mesh bone) that will generate waves after interacting with fluid.
Interaction sources are attached to the skeletal mesh with the tag specified in the OwnerComponentTag attribute, so it's important to add this tag (FluxInteractionOwner) to the skeletal mesh that will move the interaction sources.
The BPI_FluxInteraction is the interface that takes care of communication between the actor and the interaction capture system. It should be added in the Class Settings of the actor that will interact.
Implement GetInteractions from BPI_FluxInteraction. When BP_FluxInteractionCapture overlaps an interactive actor, it calls this GetInteractions function to gather information about the interactions that occurred.
There are many steps to complete before interactions start working, but no worries – you can debug and make sure that every part of your implementation works correctly:
1. Test it on the demo maps first
2. Enable DrawDebug in BP_FluxInteractionCapture
3. Enable DebugDraw in BP_FluxDataComponent
4. Enable DebugDraw in BP_FluxInteractionComponent
5. Make sure that “GetInteractions” is evaluated. Put a breakpoint in this function to check.
Interactions with a fluid surface may generate splashes, which add realism and immersion to the simulation. This is achieved through a straightforward mechanism that detects when the interaction source intersects the fluid surface. If this condition is met, a splash effect is created to simulate the disturbance caused by the interaction.
The InteractionComponent->Source->SplashType defines the index of the splash effect stored in the Surface->InteractionSplashes array.
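The splash trigger described above can be sketched as follows (hypothetical Python; names such as `spawn_splash` and the per-source state are illustrative, not the actual Blueprint API):

```python
def update_splashes(sources, fluid_height_at, splashes, spawn_splash):
    """Spawn a splash when an interaction source crosses the fluid surface.

    sources: list of dicts with 'position' (x, y, z), 'radius', 'splash_type',
             and 'was_submerged' state remembered from the previous frame.
    fluid_height_at: callable (x, y) -> fluid surface height at that location.
    splashes: splash effect definitions (the Surface->InteractionSplashes array).
    spawn_splash: callable (effect, location) that emits the effect.
    """
    for src in sources:
        x, y, z = src["position"]
        surface_z = fluid_height_at(x, y)
        # The source counts as touching the fluid when its bottom reaches the surface.
        submerged = z - src["radius"] <= surface_z
        if submerged and not src["was_submerged"]:
            # SplashType indexes into the InteractionSplashes array.
            spawn_splash(splashes[src["splash_type"]], (x, y, surface_z))
        src["was_submerged"] = submerged
```

Tracking the previous submerged state ensures the splash fires once on entry rather than every frame the source stays underwater.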
World painter and brush
The FluxWorldBrush is a unique actor designed to manipulate data stored in the WorldPainterComponent canvas. The system offers three distinct types of canvas on which the FluxWorldBrush can perform painting operations:
ColorPainter (BP_FluxWorldPainter) – interpolation between the base color and the painter color.
WaveSize (BP_FluxCoastlineDomain) – interpolation between no wave and the highest wave.
WaveType (BP_FluxWorldDomain) – interpolation between an oceanic wave and a coastline wave.
This two-minute tutorial presents the workflow and possibilities of brushes:
Reading fluid data
The Fluid Flux uses Niagara asynchronous readback events to read data from fluid render targets and pass them to blueprints. The BP_FluxDataComponent is a listener that can receive, update, and store fluid data at a certain location.
The Fluid Flux uses this feature in multiple situations:
buoyancy and floating objects
automatic dam breaking
fluid sound source analyzer
underwater camera detection
Add BP_FluxDataComponent to your actor and it will automatically detect the fluid surface under the actor. BP_FluxRotatorActor is a good simple example of an actor that can react to fluid.
The readback data component can be attached to a scene component in the owner actor by adding the tag "FluxReadbackOwner" to it.
Use the Debug option in FluxDataComponent to make sure that the component is following your actor and sampling proper values.
The Niagara integration is based on three elements:
BP_FluxNiagaraActor communicates with the simulation and passes data to the Niagara system.
NE_FluxData emitter should be inherited by the Niagara system to read the data.
NMS_FluxData – a special module that extracts simulation data into Stage Transient variables that can be used to drive the particles.
All particle systems (trash, plants, splashes) in the pack are constructed the same way; feel free to check the examples and modify them.
The BP_FluxSimulationDomain blueprint is the heart of the Fluid Flux system. This blueprint is responsible for handling important tasks like:
Rendering of the ground height map to texture.
Updating simulation of shallow water fluid, foam, and wetness.
Baking and exporting simulation state.
Sending data to the fluid surface renderer.
The Shallow Water simulation is based on the assumption of linear vertical pressure profiles, so it is simulated in two dimensions. In general, the algorithm can be described in a few steps:
Simulation data is stored in 2D render targets:
Ground map – information about the landscape and obstacles.
Velocity (RG), Depth (B), Foam (A) map – stores information about the fluid.
Height (R), Wetness (G) map – stores the surface height and wetness of the surface.
The slope of the ground heightfield and the slope of fluid are combined and used for calculating the pressure and velocity.
Simulation is an iterative process of updating fluid height and velocity.
The result of integration is used for foam and velocity advection.
Fluid modifiers can be used as input for simulation to change the current state.
The simulation frame is used for accumulating fluid wetness and generating the fluid surface mesh displacement.
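The steps above can be sketched as a minimal height-field update. This is an illustrative NumPy sketch of a generic shallow-water scheme with periodic borders, not the actual Fluid Flux shader code; foam, wetness, and modifiers are omitted:

```python
import numpy as np

def shallow_water_step(ground, depth, vel_x, vel_y, dt, g=9.8, cell=1.0):
    """One iteration of a minimal 2D shallow-water update.

    ground, depth: 2D arrays (ground height map, fluid depth).
    vel_x, vel_y: 2D velocity fields (same shape, for simplicity).
    """
    # Combined surface height = ground + fluid depth (the "slope" in step 2).
    surface = ground + depth
    # The surface slope acts as a pressure gradient that accelerates the fluid.
    grad_x = (np.roll(surface, -1, axis=1) - surface) / cell
    grad_y = (np.roll(surface, -1, axis=0) - surface) / cell
    vel_x = vel_x - g * dt * grad_x
    vel_y = vel_y - g * dt * grad_y
    # Divergence of the flux updates the fluid height (mass conservation).
    flux_x = vel_x * depth
    flux_y = vel_y * depth
    div = (flux_x - np.roll(flux_x, 1, axis=1)) / cell \
        + (flux_y - np.roll(flux_y, 1, axis=0)) / cell
    depth = np.maximum(depth - dt * div, 0.0)  # clamp: no negative depth
    return depth, vel_x, vel_y
```

On flat ground with a uniform depth the surface slope is zero, so the step leaves the fluid at rest; a local bump in `depth` creates a slope that accelerates fluid away from it.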
A BP_FluxSimulationDomain actor placed on the map does nothing by itself because it is empty and needs to be filled with fluid. The simplest way to fill containers with fluid is by using a Modifier source actor.
The BP_FluxSimulationDomain blueprint comes with a simple editor that allows simulating the state of the fluid on the map in the editor mode. Simulation can be prepared in the editor baked to state or dynamically updated by the simulation blueprint.
A modifier component is a powerful tool that changes the current simulation state. It's the simplest way to interact with the simulation. Modifiers can be used for:
Adding/removing fluid in the simulation domain.
Changing the velocity and flow direction of the fluid
Simulating custom interaction with fluid
The tutorial below presents how to add a source modifier on the scene and fill the simulation domain with fluid:
The Fluid Flux contains predefined modifiers:
BP_FluxModifierComponent – Base parent class for all fluid modifiers. Every modifier component extends it and implements specific behavior. Users can create custom modifier classes and materials for specific use cases like whirlpools or waves. Describing the architecture is outside the scope of this documentation; the BP_FluxModifierSourceComponent component is a good example of a modifier that can be used as a starting point for learning how the system is designed.
BP_FluxModifierSourceComponent – Simple modifier that allows adding/removing fluid and changing velocity in a specific area.
The quantity of fluid applied by the modifier (depending on Mode).
The velocity of fluid applied by the modifier (depending on Mode).
The shape of the fluid modifier
Defines how the modifier is applied to the simulation. All modifiers are rendered to a simulation buffer; Mode is the blending method used during this process. Add – adds the 'volume' of fluid and its velocity to the current fluid on the map. Adjust – adjusts the current state of the fluid toward the height of the modifier. Set – sets a constant height of fluid in the area.
The softness/hardness of the modifier edge. Useful for soft mixing with fluid or uniformly filling a vessel.
Scales the effect of the modifier on the domain (range (0.0, 1.0]).
Defines how long the fluid source will be active: less than 0 means infinite; 0 means the modifier is rendered in a single frame and then disabled; greater than 0 is the duration in seconds.
Affects the order of modifiers in the queue. It may be useful when the user needs to fill the domain and then remove some fluid. A value larger than 1000 forces the system to append the modifier at the end without sorting.
The modifier can be inactive at the beginning and activated by an event from gameplay or a sequencer.
There are many ways to remove fluid in the simulation domain. It depends on the mode you choose:
Mode= Set, Volume=0 – will set 0 water in the area of the modifier
Mode= Adjust, Actor.Z position – lock fluid at a specific height
Mode= Add, Volume negative – slowly removes fluid
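The three blend modes can be summarized with a small sketch (hypothetical Python, per cell; `mask` stands in for the shape and edge-softness falloff, and all names are illustrative, not the actual material parameters):

```python
def apply_modifier(fluid_height, modifier_height, mode, mask=1.0):
    """Blend a modifier into the current fluid height at one cell.

    mode: "Add" adds volume, "Adjust" pulls the fluid toward the
    modifier height, "Set" writes a constant height.
    mask: 0..1 soft-edge falloff of the modifier shape.
    """
    if mode == "Add":
        return fluid_height + modifier_height * mask
    if mode == "Adjust":
        # Move the current height toward the modifier height.
        return fluid_height + (modifier_height - fluid_height) * mask
    if mode == "Set":
        return modifier_height * mask + fluid_height * (1.0 - mask)
    raise ValueError(f"unknown mode: {mode}")
```

Under this model, Mode=Set with Volume=0 zeroes the fluid in the area, Mode=Adjust locks the fluid to the modifier's height, and Mode=Add with a negative volume drains the fluid gradually, matching the removal recipes above.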
BP_FluxModifierGerstnerComponent – Generates waves/fluid on borders of the simulation domain. More info and use cases can be found in the chapters related to ocean configuration.
BP_FluxModifierForceComponent – Generates forces when the actor is moving (based on the velocity). Can be added to the character.
The modifier container is a special type of actor that can store multiple modifiers and send them to the simulation. If the actor implements the BPI_FluxModifierContainer interface then it’s considered a modifier container.
The BPI_FluxModifierContainer interface can be implemented by any actor. A good example is BP_DemoCharacter, which uses BP_FluxModifierForceComponent to interact with the fluid. Only the AddModifiers function needs to be implemented to make it work.
BP_FluxModifierContainerActor is a basic container that can combine multiple modifier components in a single actor that works simultaneously.
BP_FluxModifierSourceActor is a specific type of container actor that simplifies the process of adding simple source modifiers to the scene.
Every modifier is rendered in an additional render target pass, so it’s not cheap. It’s good practice to minimize the number of fluid modifiers inside the simulation to achieve the best possible performance.
Unexpected spikes sometimes destroy the simulation. Shallow water simulation can exhibit unpredictable behavior due to the loss of stability when encountering steep slopes. This is a widely recognized problem that can be resolved by fine-tuning specific parameters and enhancing the reliability of the simulation.
Slope Scale (Increase)
Increasing this parameter will scale down the height of the scene on the Z-axis and make your fluid move slower on slopes. (This parameter will probably be redesigned in a future version for better consistency with world-scale measurements.)
OvershootingBlend (Increase), OvershootingEdge (Decrease) – fixes the overshooting problem by smoothing.
Simulation Delta Time (Decrease)
Fluid Flux updates with a constant delta time. The time that elapsed between two frames of the game is divided by the delta time, and the result is the number of iterations. In general, more iterations mean lower performance. By default, the Simulation Delta Time attribute is set to 0.2, which is not the safest in terms of accuracy but gives the highest performance.
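The fixed-timestep idea can be sketched like this (an illustrative sketch; the accumulator and the iteration cap are assumptions about a typical fixed-step loop, not documented Fluid Flux internals):

```python
def simulation_iterations(frame_time, accumulator, sim_delta_time=0.2):
    """Fixed-timestep update: how many iterations to run this frame.

    The elapsed frame time is added to an accumulator, which is divided by
    the constant delta time; the leftover carries over to the next frame.
    More iterations mean more accuracy but lower performance, so a smaller
    sim_delta_time makes the simulation more expensive.
    """
    accumulator += frame_time
    iterations = int(accumulator / sim_delta_time)
    accumulator -= iterations * sim_delta_time
    return iterations, accumulator
```

For example, a 0.5 s frame at the default delta time of 0.2 runs two iterations and carries 0.1 s into the next frame.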
Debug Preview / Debug Hidden In-Game (Debug)
Pay attention to the ground debug preview. If you spot red spikes in places where the simulation does not work properly, the problem is probably your ground texture, and the geometry should be adjusted.
Tweaking other variables like Slope Clamp, Velocity Clamp, Friction, Damping, Gravity, and even World Pixel Scale can also make a difference.
I am still experimenting with different solutions to this spike problem and trying to find a method to force stability even in bad conditions. You can expect multiple improvements in future updates.
The PDA_FluxSimulationState is a special data asset created especially for storing the current frame of the simulation. The simulation state is the most important structure in the system.
Simulation states are dynamically updated by the simulation actor. A dynamic simulation state is automatically created in the construction script and stored in the BP_FluxSimulationDomain actor's CurrentState variable. The data can be easily previewed and used for rendering and further fluid analysis.
The simulation state can be exported to the data asset and used later in many ways.
Can be loaded in runtime
Can be used as a starting point for the simulation
Can be rendered by the surface actor
Can be used in other gameplay tools in Niagara emitters and materials
BP_FluxSimulationDomain.CurrentState can be baked to an asset by right-clicking BP_FluxSimulationDomain and choosing Scripted Actions -> Flux Export Simulation. The short tutorial below presents the process of generating and exporting the fluid state to a data asset:
(0:07) Prepare data assets that will store the state
(0:20) Start the simulation of fluid on your map and stop when it’s ready
(0:54) Select the newly created state as a target and adjust settings.
After those actions, you should see the generated state and additional textures
Pay attention to the simulation resolution! The "power of two" rule is a fundamental necessity due to the way game engines work. You will not be able to export the simulation state if its size is not a power of two (128/256/512/1024 etc.).
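The power-of-two rule is easy to verify with the classic bit trick (a small Python sketch, not part of the pack):

```python
def is_power_of_two(n: int) -> bool:
    """A size is a power of two when exactly one bit is set."""
    return n > 0 and (n & (n - 1)) == 0

# Filtering candidate resolutions against the export rule above:
valid = [n for n in (100, 128, 200, 256, 512, 1000, 1024) if is_power_of_two(n)]
# valid == [128, 256, 512, 1024]
```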
Use case 1: Initial state
Using the state as the initial state of the simulation allows starting the simulation from a specific frame. This means a simulation of flowing water does not have to be generated every time the game starts; it can start from the saved frame.
Find the InitialState attribute
Set the state exported before
From now on, the simulation will start by reading data from the state.
Use case 2: Loading state in gameplay
Sometimes there is a need to load a state during gameplay in blueprints, for example at a checkpoint. The Simulation->LoadInitialState function can be used to handle these tasks. An example is presented in the screenshot below:
Use case 3: Using state without simulation
If you are not planning to update the simulation dynamically then a better option is removing (or disabling) the SimulationDomain actor and using SimulationState directly in the Surface actor. It may be way faster, more reliable, and save some memory. Use this configuration:
Right-click SimulationDomain and export it to a state.
Set SimulationDomain->AffectWorld = false
Set Surface->SimulationState directly using the exported asset.
When BP_FluxSimulationDomain.SurfaceActorReference is set, the actors communicate and the state is automatically passed to BP_FluxSurface.SimulationState; setting AffectWorld=false automatically disables this process.
SimulationDomain can be removed or set to IsEditorOnlyActor to avoid spawning it in the final build and wasting memory.
Fluid Flux simulates fluids on a static ground map, but updating the map from time to time is also allowed. It is not a cheap operation, so it should not be done every frame.
The BP_BreakableDam presented in the video can be destroyed when the player hits the trigger. It is a good example of updating the ground after a dynamic object changes. Take a look at the Break function, which is the heart of the dam system.
It calls UpdateGroundMap on the simulation actor, which takes the position and size of the object that will disappear and recreates the ground in that area. You can do the same when adding something to your level.
In the tutorial below, we will go through a useful workflow for generating static meshes based on a baked state of the simulation. With this method you can achieve the highest performance; unfortunately, the waterline effect will not correlate very well with the static mesh geometry.
Setting up LOD and padding
Converting a generated mesh to a static mesh asset
Preparing material for static meshes
Using static mesh in Surface actor
Mesh generation tools can be found in the BP_FluxSurface – Procedural Mesh tab.
Switch GenerateProceduralMeshView = true to see a preview of the generated mesh. After enabling this option, the surface automatically generates the mesh based on the Procedural Mesh tab configuration.
The last step is selecting SurfaceProceduralMesh Component and clicking “Create StaticMesh” which will export the mesh to the static mesh asset.
If SurfaceProceduralMesh is not visible in the list of BP_FluxSurface components, you have to deselect the surface and select it again. This happens because the engine does not refresh the list of components after toggling generation mode on/off.
Now you can find your static mesh in the content browser and preview it.
The river is simulated on the map and then converted to a static mesh. The velocity flow map still works, and the mesh can be used directly by the surface actor.
If you create a static mesh and try to save it, you might get an error saying it can’t be saved. This happens because the static mesh is using a material instance created dynamically in the blueprint. To fix it, just clear the material slot in the static mesh asset and then save the mesh.
The material configuration “MI_River_SurfaceOverStatic” is designed for static meshes and can handle fluid data such as foam and velocity encoded in vertex color. By setting “MI_River_SurfaceOverStatic.UseSimulation = false“, the system will read the data from the vertex color instead of sampling render targets. This flag allows you to use the static mesh data as a source of water information.
BP_FluxSurface can render static meshes using SurfaceMeshMode=StaticMesh which means the mesh will not be transformed by simulation state scale and geometry will be taken from SurfaceOverMesh component configuration.
Sequencer gives users the ability to create in-game cinematics through its specialized multi-track editor. Unreal Engine has many options that can be used for previewing scenes, but some of them are not supported by Fluid Flux:
Real-time – You can preview the simulation by hitting “play” (green button) in the editor. It will show you how it would behave in real-time.
Editor – Clicking "Start Simulation" in the BP_FluxSimulation actor on the scene (as presented in the tutorials) allows you to generate a specific frame and save it to a state (an initial state that can be pinned to the simulation actor as the first frame).
Sequencer editor – The simulation can't be previewed in the sequencer because Fluid Flux does not support rewinding the simulation in the current version, and the sequencer affects actors – the topic is very complicated.
Capture Movie – Sequencer Render Movie to Video. It works the same as in real-time mode and Fluid Flux water will render properly in the final video.
Capture 360 – This option was never tested
If you want to preview the simulation with a sequencer then this setup may be useful for you:
Drag and drop the sequencer on your map.
Enable the AutoPlay option in the LevelSequenceActor.
Press “Play in selected viewport” or “Simulate”
Fluid Flux provides an option to use modifier activation/deactivation functions from a sequencer. The process of using this functionality is really simple:
Add Sequencer on the level and set parameter: Autoplay=True
Add Source modifier on the level and set parameters Duration=-1, AutoActivate=false
Drag and drop the Source modifier to the sequencer.
Choose the moment of activation on the timeline and click "Add a new key", then change the properties of the event as presented in the screenshot below. You can choose the Activate/Deactivate function.
Now you can test the scene by rendering a video or playing the game. The modifier should activate automatically by sequencer event.
The BP_FluxCoastlineDomain is a dedicated system designed specifically for capturing and baking world data into CoastlineState. The architecture of the BP_FluxCoastlineDomain system is similar to the BP_FluxSimulationDomain, ensuring consistency and familiarity.
The coastline state serves as a fundamental data source during the rendering process for coastlines and oceans. The BP_FluxCoastlineDomain can communicate with the FluxSurface actor and feed it with data needed to render the water as presented in the short video tutorial below:
If, after following the tutorial steps, the coastline does not appear, you have likely encountered one of the scenarios listed below:
Z location of the Coastline actor is wrong:
Remember that after adding the BP_FluxCoastlineDomain actor to your map, you need to adjust the height (Z position) of the domain actor to match your landscape. Otherwise, you may not see the coastline waves.
The debug grid on top of the landscape geometry is a good indicator that everything is working fine.
Landscape material uses Pixel Depth Offset
If the "PixelDepthOffset" feature is used in the material, it may not be rendered to the ground map and can cause simulation problems. This can be easily fixed with a simple modification of your material.
Use the MF_FluxPixelDepthOffset material node that will automatically disable “PixelDepthOffset” while caching the ground map.
The M_Photoscan_Master material is an example of a workaround for this problem with the use of the MF_FluxPixelDepthOffset node.
The BP_FluxCoastlineDomain actor generates a state that can be exported and saved as the data asset in the project files. It can be used as an initial state or directly in the BP_FluxSurfaceActor, just like the SimulationState. This simplifies the workflow and makes the system even more user-friendly.
Exporting the current coastline state can be described in a few simple steps:
Create an empty coastline data asset for storing the data.
Save to the newly created data asset.
Exported CoastlineState data asset stores two textures.
WorldGroundMap stores the height map of the coastline area.
WorldCoastlineMap stores the distance to the coastline, direction, wave height, and blending of the coastline.
If you are not planning to update the coastline dynamically, a better option may be removing (or disabling) the CoastlineDomain actor and using the CoastlineState directly in the Surface actor. It may be way faster, more reliable, and save some memory.
Try this configuration:
Configure CoastlineDomain and export it to the state.
Set CoastlineDomain->AffectWorld = false
Set CoastlineDomain->IsEditorOnlyActor = true
Set Surface->CoastlineState directly using the exported asset.
You don't have to remove the domain actor from the scene because IsEditorOnlyActor will remove it automatically in the final build. Thanks to this solution, you are able to use it later in the editor to regenerate the state if the level geometry changes.
(More tutorials soon, thank you for your patience)
Preparing an ocean scene is an advanced topic because it requires multiple actors working together at the same time. It’s good to try all other tutorials before starting to work on recreating the ocean scene.
Examples of the setup are presented in the demo maps FluxIslandMap and FluxBeachMap; you can copy this setup (Surface, Simulation, WaveTexture, WaveModifier) or create it from scratch on your map. There is no video tutorial for ocean waves yet because I am planning a lot of improvements on this topic (an infinite ocean surface is already implemented) that will be published soon in version 1.2 of Fluid Flux. I have decided to make a tutorial after the release.
Step by step
I've prepared a minimal MinimalOceanMap with all those parameters set according to this list and am sending it to you as an attachment, so you can check every actor and compare it with your config.
Start with an empty map with some island/landscape/ground mesh:
Add BP_FluxSimulationDomain actor
Add BP_FluxSurfaceOcean actor (make sure that you use Ocean actor)
Add BP_FluxOceanWave actor
Add BP_FluxModifierWaveActor actor
Configure Simulation BP_FluxSimulationDomain:
1. Pin SurfaceActor to SimulationActor
SurfaceRenderMode = SurfaceReference
SurfaceActorReference = BP_FluxSurfaceOcean
2. Adjust the transform of ModifierWave to cover the area of the simulation (change scale and position)
Configure Surface BP_FluxSurfaceOcean:
3. Set WaveTexture = BP_FluxOceanWave actor
4. Set WaveTextureStateAreaBlend = 4
5. Set SurfaceMeshTransform.Scale = (4,4,1)
6. Set SurfaceMeshMode = Plane_1024
(optional) Configure Distant meshes in BP_FluxSurfaceOcean:
7. Add SM_FluxPlane512x512 on level
8. Set SM_FluxPlane512x512 in BP_FluxSurfaceOcean.DistantMeshes(Actors). The Surface will automatically create a material instance SurfaceDistantMaterial and apply it to all distant meshes on the list.
9. Adjust your distant mesh scale and transform to cover the area around the ocean.
– WaveTextureActor – the wave texture actor added in the previous step.
– WaveTextureStateAreaBlend – the hardness of blending between the simulation and the ocean wave.
– WaveTextureStateAreaBorder – the edges of the simulation that will be blended with the wave texture.
– Scale/Move the component to cover the simulation area
– Set SurfaceHeight to generate water at a certain height
– Set StateAreaBorders to select which borders should generate fluid and affect the simulation
The BP_FluxOceanWave is a dedicated actor for simulating ocean waves. The implementation is much simpler than classical analytical approaches like Gerstner or FFT. The system uses multiple tileable textures and combines them into a height map that can be used in the post-process/surface/Niagara system to represent wave displacement.
The fluid surface uses data from the BP_FluxOceanWave placed on the map and pinned to the BP_FluxSurfaceActor variable. The blending between the simulation and ocean waves can be configured for special cases, like on the beach map.
The beach example contains ocean waves blended with fluid simulation. All materials configured for use with oceans are declared in the Surface/Templates/Ocean/ folder. The surface material uses the UseFluxOcean=true flag to activate ocean wave blending.
Ocean waves in the simulation are generated using the modifier BP_FluxModifierWaveActor which adds fluid and velocity at the borders of the simulation area.
Additional waves in the background with simplified material (SurfaceDistantMaterial) are attached to the surface in the “DistantMeshes“.
You can use RenderDebugPreview to see a preview of the analytical waves. This actor itself is not visible; it is only there to generate the ocean wave texture that the surface will use.
Change Z location to adjust the height of the ocean wave.
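The texture-based wave approach can be illustrated with a small sketch: several tileable height layers scrolled over time and summed into one height value (hypothetical Python; the real system does this in materials with textures, and all names here are illustrative):

```python
def ocean_wave_height(x, y, t, layers):
    """Combine several scrolling tileable height layers into one height.

    layers: list of (sample_fn, scale, speed_x, speed_y), where
    sample_fn(u, v) returns a tileable height in [0, 1] for u, v in [0, 1).
    Each layer is scrolled over time and sampled with wrapped coordinates,
    then the results are averaged into the final displacement.
    """
    height = 0.0
    for sample, scale, speed_x, speed_y in layers:
        u = (x / scale + speed_x * t) % 1.0  # wrap = tiling
        v = (y / scale + speed_y * t) % 1.0
        height += sample(u, v)
    return height / len(layers)
```

Mixing a few layers with different scales and scroll speeds breaks up visible tiling, which is why this is much cheaper than analytical Gerstner or FFT waves.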
The Projection FOV is a user-friendly and configurable system designed for projecting textures onto a geometry. Useful for:
implementing area of visibility in 3D games
projecting colorful textures on the environment
simulating shadow map in toon shading systems
simplified fast cheap alternative for the spotlight
camera field of view control combined with depth test
supports two types of material rendering (mesh-based and decal based)
frustum area attachment debug
advanced coloring and texture mapping
multiple types of shape projection (rectangle, circle, masked)
can be attached to sockets and bones
configurable projection receivers
camera direction and game type independent
It's good to start with the example demo that presents all the features and the wide range of possibilities for implementing the system in a game.
The simplest way to use Projection FOV is:
Drag and drop BP_ProjectionFOV on the map.
Adjust the actor to your scene or attach it to another actor using the AttachTo function.
Choose the material and set it in the ProjectionMaterial attribute. You can use any type of material from the templates in ProjectionFOV/Materials/Templates.
There are two types of projection that can be used with the Projection FOV system – decal projection and mesh projection. The type of projection is detected automatically based on the master material used in the blueprint. Both have some special features and some limitations that need to be addressed.
Implemented in master material M_ProjectionFOV_Decal
Can only be used with the BP_ProjectionFOV actor because it requires a decal component.
Can be filtered by “Receive decals” so you can disable the rendering effect on some meshes.
Uses translucent blending
Good for rendering an area of sight
Implemented in master material M_ProjectionFOV_Mesh
Works pretty well with normal fading
Can be used on static meshes (SM_FrustumInside or SM_Light)
Uses multiplication blending
Good for simulating spotlight (car lights)
The Projection FOV comes with predefined material instances (ProjectionFOV/Materials/Templates) configured for specific use cases.
Users can also create their own materials by making a copy or a material instance of the master materials M_ProjectionFOV_Decal/M_ProjectionFOV_Mesh.
Material parameters can also be modified dynamically during gameplay via the ProjectionInstance in ProjectionFOV.
Infinity Weather update 1.1 is ready now! A full list of changes is available on Trello
Infinity Weather is a powerful and clean system designed for weather control in Unreal Engine.
The package combines 7 systems that could each be a separate package – wind, displacement, landscape, precipitation, footsteps, fog, and post-process – now available as a configurable, unified system.
configurable displacement capture blueprint
top-down projection of displacements rendered using shape definition in the shader (sphere, capsule, box, cylinder, decal, trace-sphere)
skeletal mesh displacement support
interface for easy integration with all types of actors
area of displacement moves dynamically with the actor or camera
time-based accumulation of snow
built-in functions can be used for multiple other effects (like grass interaction)
displacement material functions ( snow, mud, sand, grass)
small world texture (1024×1024) can handle even 150x150m area
two materials with fast dynamic switching between material permutations
Make sure that the Infinity Weather package version is up to date. The current newest version is described at the top of the documentation.
The package requires a procedural mesh component to work so make sure that it’s enabled.
I've decided to make this project downloadable content that can be added to a project. Unreal Engine does not support shipping config files in downloadable content, so users have to add the footsteps configuration to the project that uses the pack to make it work properly. The process is really simple:
Open the project configuration Edit->Project Settings
Select tab Engine->Physics and scroll down
Add physical surface types: Snow, Sand, Mud, Rock
The package works perfectly with the TPP input config files. If you used another template, you will probably have to add inputs to your project for package testing:
Every user of this package should start by checking all examples delivered with the product. Example maps can be found in the InfinityWeather/Demo/Maps folder.
Multiple examples are available that present different configurations:
SnowMap – simple snow effects with lightweight landscape material.
RainMap – simple rain and mud effects with lightweight landscape material.
DesertMap – simple desert and sunny weather with lightweight landscape material.
WorldMap – advanced multi-layered material with mixed landscape effects.
The example character interaction with the Infinity Weather world is implemented in Demo/Mannequin/BP_DemoCharacter by four basic components added to it:
BP_InfinityFootstepComponent – spawns footsteps effect from notifiers
BP_InfinityDisplacementComponent – renders shapes for ground displacement
BP_InfinityPostProcessComponent – controls post-process and controller communication.
BP_InfinityPawnComponent – additional effects on a character like breath particle when it’s cold.
The most important element of the pack is the InfinityWeather/BP_InfinityWeatherController blueprint, which gives users the ability to control weather conditions like fog, wind direction, precipitation, and accumulation.
Drag and drop BP_InfinityWeatherController on your level to start controlling the weather.
Atmosphere effects require an Exponential Height Fog actor plugged into the BP_InfinityWeatherController to work properly. Remember to place a height fog actor on your map and set the corresponding attribute of the weather controller.
A list of weather controller attributes looks simple but it’s a very powerful tool. For testing try to set some rainy windy weather using parameters like below:
The system can be controlled dynamically by blending between parameters during the game using the functions below.
The precipitation system is the most advanced part of this pack. It is based on a single material combined with a single mesh (no Niagara or Cascade). The billboarding effect is computed in the vertex shader in the local space of the camera. That means the system has almost no CPU cost and works ultra-fast.
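The camera-space billboarding idea can be illustrated on the CPU (hypothetical Python sketch; the real effect expands the quad in the material's vertex shader, which is exactly why the per-particle cost stays off the CPU):

```python
def billboard_corners(center_cam, half_size):
    """Expand a raindrop quad in camera-local space.

    center_cam: particle center already transformed into camera space
    (x right, y up, z forward). Because the expansion happens along the
    camera's own right/up axes, the quad always faces the viewer without
    any per-particle orientation math.
    half_size: (half_width, half_height) of the quad.
    """
    cx, cy, cz = center_cam
    w, h = half_size
    # Offsets along the camera's right (x) and up (y) axes; depth unchanged.
    return [
        (cx - w, cy - h, cz),
        (cx + w, cy - h, cz),
        (cx + w, cy + h, cz),
        (cx - w, cy + h, cz),
    ]
```

In the actual material, the same offsets are applied per vertex as world-position offset, so a single static mesh of quads renders the whole precipitation volume in one draw.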
There are a few predefined effects of precipitation defined in the project (/InfinityWeather/Precipitation/)
The precipitation effect can be used without a weather controller. You can drag and drop the blueprint of the chosen effect on the map to use it as static volume.
A manually placed actor can be configured and limited by the area of precipitation. Users also can set multiple weather types visible locally in some areas of the map. All parameters are described in editor hints.
You can create your own precipitation effect classes for specific rain/snow configurations by extending the base class and changing its attributes. The newly created class will be available in the Precipitation Effect Class list in the weather controller.
The /InfinityWeather/Precipitation/BP_InfinityPrecipitationOcclusion is a specific type of actor that contains a render target texture used for calculating precipitation collision with roofs and other cached meshes. After placing this actor on the scene, you will notice that within the area of the occlusion map, rain and snow are not rendered under the meshes.
Currently, only one occlusion map per level is supported, but switching between multiple occlusion maps dynamically is planned.
The wind is based on the integration of multiple effects that react to a simple wind direction vector.
Currently, the wind effects can be controlled globally on the full scene using the Wind Direction vector in BP_InfinityWeatherController.
Most objects that react to wind use the Environment/Functions/MF_InfinityWind material function, which returns all important information about the current wind state. It's the simplest way to get global wind data.
A list of implemented effects and objects that react to the wind force:
Precipitation direction – the Wind Force attribute in the precipitation blueprint scales how strongly the wind affects it.
Vegetation – the MF_WindBush, MF_WindTreeLeafs, MF_WindTreeTrunk, and MF_WindGrass nodes implement the effects of wind on vegetation.
Flags – specific implementation of flag material M_Flag.
Cascade particle emitters – the Wind Affected Emitters attribute in the Weather Controller is a list of emitters that should use the Wind Direction and Wind Intensity parameters to react (example: Environment/Dust/PS_SnowBlowingLarge).
Dust – predefined planar dust effects that rotate and fade based on wind direction.
Grass – MF_WindGrass material node.
Landscape dust – MF_GroundDust effect on the sandy landscape.
Clouds shadow – M_CloudsShadow
The BP_InfinityFog actor can be used to change fog settings in certain areas. The fog actor is divided into two parts:
Fog volume – The volumetric shape of fog is rendered as overlay mesh.
Fog atmospherics – atmospheric settings. When the camera is inside the fog volume, the controller blends into these settings based on the Weight attribute value.
Currently, only an ellipsoidal fog shape is supported.
Accumulation is the group of parameters that controls snow coverage and wetness of meshes whose master material uses the MF_InfinityWeatherSurface node.
MF_InfinityWeatherSurface is an advanced node that adjusts base color, normal, and roughness values to the current weather conditions, driven by two basic parameters used in the material:
Weather.Controller.WetSurface – Controls wetness of the surface
Weather.Controller.SnowySurface – Controls the snow shell on the surface.
An example of MF_InfinityWeatherSurface in use is the rocks material M_RockSnowMaterial. Drag and drop the rock mesh (/InfinityWeather/Demo/Environment/Rock/SM_Rock) onto your map to see how weather conditions affect the material. Notice that combining the wet and snowy materials can produce a nice-looking icy snow effect.
Additionally, meshes can be painted with vertex color (red channel) to mask the effect on meshes that are hidden under occluders.
A more detailed description of additional parameters can be found in the landscape and displacement section.
The Infinity Weather system introduces an advanced displacement material system based on render targets. In short, the BP_InfinityDisplacementCapture actor searches for actors around the focus point that implement BPI_DisplacementShape. The BPI_DisplacementShape implementation adds a list of shapes (a shape type and transform) to a stack. In the final stage, the stack of shapes is rendered to a displacement texture. The iteration is repeated every frame with a small random offset.
Three elements are needed to make the system work:
Displacement receiver landscape with the material that supports displacements. (MI_LandscapeSnow/MI_LandscapeMud/MI_LandscapeSand, MI_LandscapeCombined)
Displacement capture actor placed on the map (BP_InfinityDisplacementCapture), set to capture landscape ground.
Displacement mesh actor or component that will affect the ground (BP_InfinityDisplacementStaticMeshActor)
The video below shows how to combine all these tools and make them work:
Landscape materials are an advanced topic because injecting them into a project requires basic knowledge of the Unreal material system.
The example content (Landscape/Materials) comes with a few example landscape configurations to make this step easier. There are simple materials that cover the landscape entirely with a single type of displacement:
M_LandscapeSnow (uses MF_InfinitySnow)
M_LandscapeMud (uses MF_InfinityMud)
M_LandscapeSand (uses MF_InfinitySand)
There is also an advanced version of the material that combines all three effects in one landscape. Additionally, the extended version supports MF_InfinityWeather. It is available in two versions: virtual texturing and multilayered.
Let’s take a look at the example Sand (M_LandscapeSand) material used on DesertMap:
The material is a combination of four nodes:
MF_GroundSand – the base material layer of sand, containing a color map, normal map, roughness, etc. It can be replaced by any material, for example one created from a GameTextures/Example Project/Any pack.
MF_InfinityDisplacement – the base node that reads the displacement render target and returns data from it. As input, it takes the displacement layer intensity. This is the layer that will be used to paint the landscape.
MF_InfinitySand – combines data from the displacement and ground layers. Additionally, this node implements extra effects like coloring the displaced ground. Two other nodes, MF_InfinityMud and MF_InfinitySnow, can be used here to achieve different effects.
MF_InfinityWeather – adds the overlay effect of a wet surface that can be controlled by the weather controller.
The BP_InfinityDisplacementCapture is the main actor that prepares displacement depth textures for the landscape. Below is a simple explanation of how this actor's algorithm works:
The displacement capture actor searches for actors that overlap the displacement area.
If an actor that supports the displacement interface is detected, it is asked for its list of shapes that displace the ground.
Newly generated shapes are added to the stack.
In the final step, the displacement capture actor renders all shapes added to the stack.
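The steps above can be modeled with a short plain-Python sketch (the names are illustrative; the real system works with engine actors, an interface call, and render targets):

```python
# Illustrative model of the displacement capture loop: actors implementing
# the displacement interface contribute shapes to a stack, which is then
# rendered (here simply collected). Names are hypothetical.

class DisplacementActor:
    def __init__(self, shapes, position):
        self.shapes = shapes      # list of (shape_type, transform) tuples
        self.position = position  # 1D position for brevity

    def add_displacement_shape_data(self):
        # Stand-in for the BPI_DisplacementShape interface call.
        return self.shapes

def capture_displacement(actors, focus, radius):
    stack = []
    for actor in actors:
        if abs(actor.position - focus) <= radius:              # overlap test
            stack.extend(actor.add_displacement_shape_data())  # query interface
    return stack                                               # "render" the stack

actors = [
    DisplacementActor([("Sphere", "foot_l"), ("Sphere", "foot_r")], position=2.0),
    DisplacementActor([("Box", "crate")], position=50.0),  # outside capture area
]
shapes = capture_displacement(actors, focus=0.0, radius=10.0)
```

In the real system this loop runs every frame, and the collected shapes are ray traced into the displacement texture rather than returned as a list.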
A simple configuration of the scene with displacements is presented below:
Drag and drop BP_InfinityDisplacementCapture on the map.
Add the Landscape actor to BP_InfinityDisplacementCapture.GroundMeshes.
This is an important step: it notifies the system about the receiver of the displacements.
Select the landscape and use a material that supports displacements, for example M_LandscapeSand.
Edit the landscape and paint the layer of displacement on it.
For optimization, displacements are rendered only in the area around the focus point. The focus point is taken from BP_InfinityDisplacementCapture.CaptureActor. If CaptureActor is null, the character Pawn is used as the focus. If the Pawn is also null, the camera location is used.
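The fallback order can be summarized in a tiny sketch (plain Python, illustrative; the real system reads actor transforms):

```python
# Illustrative fallback order for the displacement focus point:
# CaptureActor -> player Pawn -> camera location.

def get_focus_location(capture_actor, pawn, camera_location):
    if capture_actor is not None:
        return capture_actor["location"]
    if pawn is not None:
        return pawn["location"]
    return camera_location

# No CaptureActor set, so the Pawn location is used:
focus = get_focus_location(None, {"location": (10, 0, 0)}, (0, 0, 500))
```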
Two types of render targets are defined in the system. Both area sizes can be changed in the capture displacement actor.
Capture Render Target – caches shapes around the focus point. Modify the CaptureTextureSize attribute.
World Render Target – combines all cached data. Modify the render target TR_Persistent resolution to adjust the size of the history.
Simple static shapes
The displacement map rendering is based on ray tracing shapes in the shader to get the best possible efficiency, but it is also limited to the number of shapes predefined in the shaders. The shapes that can displace the ground are defined in the /InfinityWeather/Displacement/Blueprints/Shapes/ folder:
BP_DisplacementDecal (heightmap mapping on the ground)
All of them can be placed on the map and scaled.
Use the Debug option in BP_InfinityDisplacementCapture to preview the shapes rendered by the displacement capture.
BP_InfinityDisplacementComponent is a component that allows attaching a list of shapes that displace the ground.
An example implementation of the displacement component in a clean mannequin character is presented in the video below:
Adding displacements to your character is easy:
Open the character blueprint and add BP_InfinityDisplacementComponent to your character.
Select the component and set the Displacement Shapes data asset. You can use the full ragdoll (DS_MannequinRagdoll) implementation or the foot (DS_MannequinFoots) implementation, which is faster and prepared for walking.
Select “Class Settings” in the top bar and add the BPI_DisplacementShape interface to the list of Implemented Interfaces. It should look like the screenshot:
Now implement the interface function called AddDisplacementShapeData.
After these few steps, the character will be detected and ready to work.
Custom collision type
The Infinity Weather displacement capture system uses a box collision volume when searching for actors that should be captured in the displacement buffer. The overlap detection requires a specific collision type for filtering collision shapes. The engine has no option to include a custom collision type in an external marketplace package because collision types are defined in the configuration files.
By default, Infinity Weather uses the predefined Destructible collision type, but that is not the best solution for every project: this collision type can be occupied by other functionality and turned off for specific actors like characters.
The best way to overcome this problem, and also improve performance during the overlap test, is to create a custom collision type and use it to detect the shapes in the displacement area. This solution will also fix all issues with ALS ragdoll detection and other strange behaviors that could occur after placing the BP_InfinityDisplacement actor on the map.
Creating a custom collision type is described in the Unreal Engine 4 documentation; the explanation below shows how to apply this knowledge to Infinity Weather:
1. Find Edit->ProjectSettings->Collision
2. Open the Object Channels tab and add a new Object Channel called “InfinityCapture”, set to Ignore by default.
Now you have to set a new collision type in the displacement capture actor.
It's good practice to inherit from the BP_DisplacementCapture class and edit the child class instead of changing package assets. If you do it this way, the next package update will not break your project.
Open BP_DisplacementCapture (or its child class) and find the CaptureVolume component.
In CaptureVolume->DetailPanel->Collision->CollisionPresets->ObjectType set the InfinityCapture type.
From now on, every actor that has Collision Response->InfinityCapture->Overlap set to true will be considered during the displacement detection process.
Find the character class and select the collision component that should be detected (CapsuleComponent or Mesh).
In the Details panel, find the Collision tab -> Collision Presets.
If the collision preset type is “Custom…”, just switch InfinityCapture->Overlap to true.
Otherwise, you will have to create a custom preset or edit the existing preset the same way in the project collision settings.
Remember to repeat these steps for all actors that should be detected by displacement capture like BP_InfinityDisplacementStaticMeshActor.
You can use the debug option in the BP_InfinityDisplacement actor to see whether the actor is detected.
Displacement data asset
The configuration of displacement shapes attached to actors is stored in Data assets (PDS_DisplacementShapes).
Infinity Weather contains a few example data assets for mannequins and vehicles, but any user can create custom data assets that will work with a specific skeletal mesh. Creating a displacement data asset is simple:
Right-click in the content browser.
Pick PDS_DisplacementShapes and create the asset.
Name the newly created data asset, e.g. “DS_ExampleDataAsset”, for future use.
The newly created data asset can be used in characters, but it still needs some shape definitions. Shapes can be added by hand, but Infinity Weather comes with a simple editor that helps preview how shapes are attached.
Drag and drop the editor class (BPU_InfinityDisplacementShapesEditor) onto the map.
Select the editor and choose the newly created data asset (DS_ExampleDataAsset) in the Data attribute.
Pick the skeletal/static mesh reference that will be the visual representation of the shape that makes the displacement. It can be your character or car.
Press Load; now you can add shapes to the list. The shape structure is described below.
When you are done, press “Save” to save the changes. Remember to apply the newly created data asset in the DisplacementComponent.
The bone or socket name used as the attachment transform. The system will use the component transform if the socket name is not defined.
Shape type. Currently supported: Box/Sphere/Cylinder/Capsule/Decal, Trail Wheel, Trail Sphere
Scales the intensity of the interaction (not implemented yet).
Relative transform in space of attachment.
The texture is used as a pattern in the displacement decal.
The weather system also supports an advanced footstep system integration based on notifiers placed in animations.
Working with footstep system:
Add the BP_InfinityFootstepComponent component to a character. The component's configuration contains default predefined templates of effects and attachments for the mannequin.
Add BP_NotifyFootstep to the walking/running animations at the point where the foot hits the ground. Choose the left or right foot in the properties.
Add BP_FootstepVolume on your scene and select the preset of footstep that should be spawned inside of the volume.
The engine sometimes detects the footstep ground incorrectly because of limited landscape layer blending during ground tracing. Footstep volumes (BP_FootstepVolume) solve this problem easily: when the character is inside the volume, the system uses the footstep set in the volume definition, so the incorrect footstep effect is overridden.
The Priority and Required conditions are taken into account to choose the most relevant footstep effect; effects can even be filtered. The example below shows how to configure a volume at the second priority level (higher priority is more important) that spawns snow and is visible only when at least 0.1% of the weather displacement is active.
When BP_InfinityPostProcessComponent is placed in an actor that contains a camera component, it will automatically add the post-process material and communicate with the weather controller to get information about the current conditions.
Three types of post-process view objects are supported:
Postprocess camera – by default, the system searches for a camera in the owner.
Postprocess component – if there is no camera in the owner, the system searches for a component.
Postprocess Volume – a custom volume can be set for the character on the scene via the Postprocess Volume attribute.
A custom post-process object can also be set via the SetPostProcessView function.
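The search order for the post-process view, including a manual override, can be sketched as follows (plain Python, illustrative only; the real component inspects engine components):

```python
# Illustrative fallback order for finding the post-process view:
# an explicitly set volume wins, then a camera in the owner,
# then a post-process component.

def find_postprocess_view(owner, custom_volume=None):
    if custom_volume is not None:
        return ("Volume", custom_volume)
    if owner.get("camera") is not None:
        return ("Camera", owner["camera"])
    if owner.get("postprocess_component") is not None:
        return ("Component", owner["postprocess_component"])
    return None

view = find_postprocess_view({"camera": "FollowCamera"})            # camera found
fallback = find_postprocess_view({"postprocess_component": "PPC"})  # no camera
```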
Raindrops and circles appear only when the precipitation actor has InteractWeather > 0.0. This means raindrops appear only inside the rain volume.
Screen raindrops fade after some time. The duration of the fade can be controlled by BP_InfinityPostProcessComponent.Rain parameters.
When the camera is under cover, it does not get wet and raindrops do not appear.
Freezing and heat haze distortion are calculated locally from the weather controller and the fog actor's atmosphere settings (temperature, distortion).
The screen effects are implemented using two post-process materials.
MI_InfinityPreTranslucent – a group of effects rendered before the translucency layer.
Rain Circles – enabled automatically when rain precipitation is active.
Distortion – a heat haze distortion effect controlled by the atmosphere settings.
Glittering – Experimental disabled effect of glittering rendered on snow and sand.
MI_InfinityPostProcess – a group of effects rendered after the tone-mapping.
Sharpening – an additional sharpening effect that improves the quality of close objects (can be disabled via UseSharpening).
Drops – animated on-screen raindrops, visible only when the camera is exposed to rain.
Frozen – Frozen screen edges.
Weather replication is implemented on the server side. This means all functions of BP_InfinityWeatherController that control the weather should be called on the server; the state is then automatically sent to the clients. The weather state is also replicated when a player joins the game later.
Displacement render targets are not replicated; they are calculated locally around the focus point of the displacement capture actor, which follows local player 0. This means that if multiple actors walk in the same area at the same time, the result looks similar for all of them, not because it is replicated, but because it is simulated locally with the same data.
When testing displacements in multiplayer, be sure to run the game in separate instances; otherwise, UE4 shares the memory of the ground texture, and bugs appear because multiple displacement actors render to the same displacement texture at the same time. This issue does not affect the final released game, so don't worry. Use the SingleProcess=false option to test the game in the editor with proper displacements in multiple windows.
Future features that will be implemented for multiplayer games:
Local weather controllers are not supported yet, but they are on my roadmap.
Replicated footprint decals that will be sent to clients and activated when displacement actors are close.
Advanced Locomotion System
The most important part is implementing a custom collision type that will solve most of the problems related to ALS and IW volume overlapping. You can find detailed descriptions of this step in the chapter called “Custom Collision Type”.
After you finish this collision configuration, it is additionally worth setting Climbable = Ignored in the Trace Responses of the Capture volume component, because it is not needed. The final configuration of the InfinityCapture actor should look like below:
It's also important to update the ALS_character collision preset in the project config: the InfinityCapture channel should be set to Overlap.
When the collision setup is ready, it's time for the ragdoll. During ALS ragdoll mode, capsule overlap is not detected, which is why displacements are not rendered by default. There are two possible methods to fix it; the choice is yours:
Select the ALS Character Mesh, switch Generate Overlap Events = true, and set the InfinityCapture channel to Overlap.
An alternative, more general method: in ALS_Character, use Add Component -> Sphere Collision on the character actor and configure the attributes as presented in the screenshot below:
There is another important thing about ragdolls: switching between the ragdoll/feet presets.
The DS_MannequinRagdoll preset is slower and more precise; the DS_MannequinFoots preset works well at low FPS because it supports continuous movement. Fortunately, ALS supports this feature very easily: ALS_AnimMan_CharacterBP has RagdollStart and RagdollEnd functions that fit the switching requirements perfectly.
Use SetDisplacementShapesDataAsset on the DisplacementComponent.
RagdollStart should set the DS_MannequinRagdoll asset as a parameter.
RagdollEnd should set the DS_MannequinFoots asset as a parameter.
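Putting the two hooks together, the swap logic looks conceptually like this (plain Python sketch; only the asset names come from the documentation above, the class and function shapes are illustrative):

```python
# Illustrative model of swapping displacement data assets on ragdoll
# start/end, mirroring the ALS RagdollStart/RagdollEnd hook-up.

class DisplacementComponent:
    def __init__(self, default_asset):
        self.asset = default_asset

    def set_displacement_shapes_data_asset(self, asset):
        self.asset = asset

def ragdoll_start(component):
    # More precise shapes while the whole body is on the ground.
    component.set_displacement_shapes_data_asset("DS_MannequinRagdoll")

def ragdoll_end(component):
    # Faster foot-only shapes for normal locomotion.
    component.set_displacement_shapes_data_asset("DS_MannequinFoots")

comp = DisplacementComponent("DS_MannequinFoots")
ragdoll_start(comp)
in_ragdoll = comp.asset
ragdoll_end(comp)
```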
Questions & Answers
I can't see any precipitation below height 0.0 in the level.
Add BP_InfinityPrecipitationOcclusion on your map.
Explanation: The system by default treats height 0.0 as the default occluder. BP_InfinityPrecipitationOcclusion will analyze your level's occlusion and update the ground height.
Why is the landscape flickering after painting the displacement?
Solution: Use a geometry brush on the landscape to displace the mesh a bit, and the mesh will not disappear anymore.
Explanation: It's an engine bug that I can't overcome, but it's easy to fix in your project. The only way to fix it is to increase the size of the landscape component's bounding box by editing the ground. The bug is related to the bounding boxes of landscape fragments: if the landscape bounds are hidden under the displacement mesh (extruded using vertex offset), the occlusion system removes that fragment of the landscape. It flickers because occlusion is probably calculated based on the previous frame, in which the component's visibility was the opposite.
I am very happy to introduce my new game demo created especially for the NVIDIA spotlight contest.
Lightning Fast is a powerful combination of materials and blueprints that implements a wide range of realistic and stylized electricity effects.
Lightning effects have plenty of uses in games, from background ambiance during a storm and electric fences to devastating lightning guns and spells. This system is designed to achieve all those effects with high performance and AAA quality at the same time.
Two types of beam rendering (spline mesh / spline billboard mesh)
Integrated with UE4 dynamic lights and material light functions and splines
Over a hundred parameters in materials to customize the final effect
Spline-based bolt shape, easy to adjust to the scene
Character electrocution discharge effect
Advanced lightning mesh editor tools
Multiple animated lines rendered in a single beam
Depth fade/direction fade effect
Glow and line color
Refraction based distortion
Procedural beam texture
Dynamic branch masking and fading effect
prototype pack – textures and materials created for demonstration purposes
mesh generator – an advanced tool prepared for editing and generating lightning meshes; allows implementing an external lightning propagation class
demo example map – a showcase of 14 use cases
Procedural mesh lib – an additional library of functions that helps create static meshes in UE4
Basically, everything starts with the geometry (meshes) and the effect (materials) rendered using that geometry.
A Lightning Fast mesh combined with its material has some useful features and properties that are very important in production:
can be used with spline meshes
can be skinned and used as skeletal mesh
can be scaled and keeps the proper shape
can be used in a particle system
can be viewed from every angle
That gives the system a wide range of effects to handle, so the package is divided into multiple blueprints specialized for particular effects.
All of the effects can be found in the Blueprints folder and are presented on the example map (Demo/Maps/LightningFastMap)
Find the Blueprints/LightningBolt folder
Drag and drop BP_LightningBolt onto the scene
The effect is ready to use. Now you can adjust parameters to get the effect you need.
The Blueprints folder contains multiple lightning effects that show how the system works with materials and meshes; they should be treated as example content or templates prepared for general use cases. Some of the effects are blueprints, while others are represented only by particle effects used by the main character.
Check the Demo/Manequin/ThirdPersonCharacter blueprint to analyze the implementation of the skills.
The list of effects will be extended in future updates.
It's good practice to create your own effects by inheriting the Blueprint classes and copying the material instances to your project folder.
The Blueprints/BP_LightningSpline blueprint is a simple combination of spline mesh effects and lights. You can combine any mesh created with the Lightning Fast package with splines; even bending the lightning bolts is allowed.
Drag and drop BP_LightningSpline on the scene and modify parameters. [Editing splines]
Mesh used to bend
The material used on mesh
The axis of the mesh that points along the forward direction of the bending.
Translucency Sort Priority
Fix Spline Direction
Additional correction of spline UP direction. Fixes some twist bugs.
Scale texture UV for beam start and end.
Override material glow color.
Override material line color.
Number of lights created for spline
Light Snap Curve
Importance of snapping the lights to the curve (0-1)
Override color of light
Falloff parameter of the light.
Use inverse squared falloff
Falloff mode switch.
The light material function used for blinking
BP_LightningBolt is a complex example of a lightning bolt effect implementation based on the lightning spline blueprint. The biggest advantage of this blueprint is that it can be easily integrated with any weather system.
Bolt animation curves describe how parameters change during the update: R – time offset, G – fade, B – (not used yet), A – lightning intensity
Speed of animation update.
-1 – infinite, 0 – not active at start, >0 – loop count.
Delay until the first activation.
Draws a debug preview at a given time of the animation.
Random time in the range (x-y) that the system waits until the next activation.
Whether the system should randomize rotation and location in Random Volume.
Range of rotation change during randomization.
The world actor that manages the light intensity on the scene.
Use parameter World to register blueprint in BP_LightningWorld. The BP_LightningWorld actor is the manager that changes directional light intensity and skylight intensity during the stormy weather according to all managed lightning bolts.
To activate the Lightning Bolt by hand (from events), use the Activation function and set ActivationLoops = 0.
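The loop-count semantics described above (-1 infinite, 0 inactive at start, >0 a fixed number of activations) can be sketched like this (plain Python, illustrative only):

```python
# Illustrative model of the activation loop counter:
# -1 fires forever, 0 never fires, >0 fires that many times.

def run_activations(loop_count, max_ticks=10):
    """Return how many times the bolt fires within max_ticks updates."""
    fired = 0
    for _ in range(max_ticks):
        if loop_count == 0:
            break
        fired += 1
        if loop_count > 0:
            loop_count -= 1   # -1 is never decremented, so it loops forever
    return fired

run_activations(-1)  # fires on every tick
run_activations(0)   # never fires
run_activations(3)   # fires exactly three times
```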
A dynamically updated chain of splines that connects target points within the closest range.
Simple use case of working with a beam:
Drag and drop BP_LightningBeam onto the scene.
Drag and drop a few target actors (BP_LightningTarget) onto the scene near the lightning beam actor.
Select BP_LightningBeam and edit the parameters: IsActivated = true.
Find the InitChain array and add one element. Set up the starting chain element.
Set TargetActor to Self; this will make the beam actor the source of the effect.
Play and watch how the beam connects the target actors moving around.
Basically, the BP_LightningBeam actor is based on the idea that the chain should be represented by multiple nodes connected by spline meshes in a specific order. The list of nodes and their order can be set by the user or generated automatically.
The BP_LightningBeam effect is divided into static and dynamic parts.
Static (set by user) – predefined, represented by InitChain structure created by the user during activation.
Dynamic (generated) – generated starting from the last element of the static fragment, using a search-in-range algorithm.
InitChain is the array that describes the connections between chain elements. The first element of the array is the root (starting point). For example, the InitChain of a lightning gun can be represented by two nodes: the first node is the gun muzzle and the second one is the hit point. The ParentIndex is the index of the parent node in the InitChain.
You can even specify a looping shape in init chain by this kind of config:
InitChain = (TargetActor=Target0, ParentIndex=0) //start, points to self so it’s empty
InitChain = (TargetActor=Target1, ParentIndex=0) //spline from element 0 to Target1
InitChain = (TargetActor=Target2, ParentIndex=1) //spline from element 1 to Target2
InitChain = (TargetActor=Target0, ParentIndex=2) //spline from element 2 to Target0 (loop)
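Resolving such an InitChain into renderable spline segments can be sketched as follows (plain Python; the real system works with actors and spline meshes, and the dictionary form here is just an illustration):

```python
# Illustrative resolution of an InitChain into (parent -> node) segments.
# The root element points at itself and therefore produces no spline.

def resolve_segments(init_chain):
    segments = []
    for i, node in enumerate(init_chain):
        parent = node["ParentIndex"]
        if parent == i:   # root points at itself: nothing to render
            continue
        segments.append((init_chain[parent]["TargetActor"], node["TargetActor"]))
    return segments

init_chain = [
    {"TargetActor": "Target0", "ParentIndex": 0},  # root
    {"TargetActor": "Target1", "ParentIndex": 0},
    {"TargetActor": "Target2", "ParentIndex": 1},
    {"TargetActor": "Target0", "ParentIndex": 2},  # closes the loop
]
segments = resolve_segments(init_chain)
```

The four-element chain above yields three spline segments forming the loop Target0 -> Target1 -> Target2 -> Target0.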
Init Chain Node
The actor used as a node in the chain. The algorithm uses the actor's location as the endpoint of the spline mesh.
The index into the InitChain array where the algorithm will look for the starting point of the spline mesh. InitChain[ParentIndex].TargetActor is used to set the location.
If the target actor is null then Target Location is used.
Overrides the normal direction of the spline mesh. It can be used to change the shape of the spline.
Overrides the source direction of the spline mesh. It can be used to change the shape of the spline.
How to react if the actor is destroyed/null:
None – use TargetLocation and do nothing
Deactivate – deactivate the actor
Generate – regenerate the chain using dynamic generation.
Remove – remove the node and continue as before.
The last element of the InitChain is the starting point of the search. The search uses the attributes in the Search tab to specify which actors should be in the dynamic chain.
After the InitChain is specified, dynamic chain generation begins. ChainLength is the number of spline meshes that can be rendered by the BP_LightningBeam actor. The ChainLength attribute is used to calculate how many additional nodes should be generated: the number of dynamic nodes the algorithm will look for is ChainLength reduced by the number of renderable connections specified in the InitChain.
Example situation (lightning gun): ChainLength = 4, InitChain contains 2 elements, and the connection between the start point (muzzle) and one target point is represented by 1 spline mesh.
Finally, the algorithm will look for 4-1 = 3 target points and try to render 3 additional spline meshes.
The algorithm of searching is simple:
Find the target actor in range closest to the last element of the chain.
Add it to the chain.
If the combined chain is smaller than ChainLength, go to step 1.
After the search process, all nodes are linked together in a chain 0-1-2-3 … -n. The UseClosestToChain=true option runs an additional update that changes the topology of the chain to render the shortest possible spline meshes; basically, it looks for the closest connections between nodes.
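The greedy search described above can be sketched in a few lines (plain Python, 1D distances for brevity; names and structure are illustrative only):

```python
# Illustrative greedy chain search: repeatedly append the in-range target
# closest to the last chain element until ChainLength nodes are linked.

def build_dynamic_chain(start, targets, chain_length, search_range):
    chain = [start]
    remaining = list(targets)
    while len(chain) - 1 < chain_length and remaining:
        last = chain[-1]
        closest = min(remaining, key=lambda t: abs(t - last))
        if abs(closest - last) > search_range:
            break                     # nothing left within range
        chain.append(closest)
        remaining.remove(closest)
    return chain

# 1D positions for brevity; start at the muzzle (0.0).
chain = build_dynamic_chain(0.0, [5.0, 2.0, 30.0],
                            chain_length=3, search_range=10.0)
```

Starting from 0.0, the search picks 2.0 first, then 5.0 (closest to 2.0), and stops because 30.0 is out of range.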
A complex example of Lightning gun implementation based on Lightning Beam blueprint.
A particle emitter that presents the discharge effect.
A trail effect attached to an actor; in the example content it is used as a ribbon behind the character during a slow-motion run.
An object targeted by the lightning gun and other skills. The target also spawns an example discharge particle when it is hit by a lightning beam or lightning gun.
Folder contains a base class of dynamic light source components and light materials.
M_LightningFast is the heart of the package and all material subtypes derive functionalities from this advanced master material.
Creating a material instance in UE4 is really simple. Right-click the material (M_LightningFast) and choose the option “Create Material Instance”. After that operation, the newly created material is ready to configure and use on Lightning Fast meshes. You can do the same with all the material instances created for specific cases stored in the Blueprints folder, which I recommend doing first.
The material has great potential for setting up a wide range of effects. The following list includes all available configuration options with usage recommendations. Some of the parameters can be hard to describe and understand at first, but I encourage you to test them in the engine and check the results of your changes. 🙂
List of general basic options.
Scales the geometry of billboarding mesh.
Opacity clamping for high values.
True – use the billboarding system for rendering; false – render the original mesh.
Draw debug mesh
Use the refraction effect.
Clamp intensity of refraction effect.
Optimization that forces system to use interpolators (does not work with particle emitters)
A group of parameters that helps fade the effect in specific conditions, e.g. close to a wall or at an unpleasant angle.
Normal fade is a feature that helps hide the beam when it is viewed at a bad angle relative to the mesh. The fade is calculated based on the normal vector of the mesh; for a mesh in billboard mode, the normal means the axis of rotation. When the axis is similar to the camera direction, the mesh should be invisible, because otherwise some bugs can be noticed.
Scales the effect, making it fade faster or slower.
Offset the effect to make it more visible.
Makes effect translucent.
Whether lightning should fade when intersecting scene meshes.
How fast lightning should fade when intersecting scene meshes.
The vertex color of lightning meshes contains additional fade information; this option allows using it for fading the beginning and end of the node.
Scale vertex fade start
Scale vertex fade end
The lightning beam is built of four branches that can be displayed at the same time or one after another. This section covers the basic settings.
Activate the algorithm of asynchronous fading of branches.
Alpha multipliers of each branch of beam stored separately in RGBA channels.
Time offsets in fading animation.
Exponential fading speed of endpoint.
Fading blend in of start point.
The scale of fading UV.
Branch fading animation speed.
External controller of branch fading time.
Combines the branches into one beam. Set to false to use separate colors.
Combines branches using the max function; false forces the system to use the sum function.
Use the fourth branch from the alpha channel.
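A sketch of the combine logic described above (illustrative only, not the material's actual node graph): the four branch alphas live in the RGBA channels and are merged either with max or with a clamped sum, optionally ignoring the fourth branch.

```cpp
#include <algorithm>
#include <cassert>

// Illustrative combine of the four branch alphas stored in RGBA.
// bUseMax mirrors the "combine by max" flag described above: when
// false, the system falls back to summing the channels (clamped to 1).
// bUseFourth mirrors the "use the fourth branch" flag.
float CombineBranches(float R, float G, float B, float A,
                      bool bUseFourth, bool bUseMax)
{
    if (!bUseFourth)
        A = 0.0f;
    if (bUseMax)
        return std::max(std::max(R, G), std::max(B, A));
    return std::min(R + G + B + A, 1.0f);
}
```

Max keeps overlapping branches from blowing out the brightness; sum makes overlaps noticeably brighter.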
Masking is a feature for randomly hiding branches of the lightning. Thanks to masking some of the nodes during each update, even static meshes can look dynamic and varied.
Beam configuration focused on UV mapping transformations.
UV mapping of the distortion texture:
RG – the XY scale of the beam distortion UV (a higher value means denser UV).
BA – the speed of UV movement (a higher value means a more dynamic effect).
Additional skew UV offset stretches the distortion.
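The RG/BA packing above can be sketched as follows (the packing is taken from the description; the function and struct names are illustrative):

```cpp
#include <cassert>
#include <cmath>

struct UV { float U, V; };

// Illustrative distortion UV transform: the RG channels scale the
// mapping density, the BA channels scroll it over time so the
// distortion keeps moving.
UV DistortionUV(UV In, float Time,
                float ScaleX, float ScaleY,   // RG channels
                float SpeedX, float SpeedY)   // BA channels
{
    return { In.U * ScaleX + Time * SpeedX,
             In.V * ScaleY + Time * SpeedY };
}
```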
Beam blend is the effect of fading the beam at its start and end.
The intensity of start beam blending.
The intensity of end beam blending.
Adds a UV offset based on the vertex color R value. This parameter helps make the mesh beam look more varied.
Most important parameter
Bolt is the effect that allows animated offsetting of the beam position on the mesh.
Whether the bolt animation should be active.
Start/end position of bolt.
Sets the offset of bolt animation (can start from any point)
Sets the value range of bolt animation.
Speed of bolt animation.
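A minimal sketch of how such a bolt offset animation can work, assumed from the parameter descriptions above (offset, range, speed); this is not the actual material code.

```cpp
#include <cassert>
#include <cmath>

// Illustrative bolt animation: a looping position along the beam,
// shifted by Offset (so the animation can start from any point) and
// confined to the [RangeMin, RangeMax] value range.
float BoltPosition(float Time, float Speed, float Offset,
                   float RangeMin, float RangeMax)
{
    float Phase = std::fmod(Time * Speed + Offset, 1.0f); // loop in [0,1)
    return RangeMin + Phase * (RangeMax - RangeMin);
}
```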
Each beam is made of multiple light lines that are distorted by a noise texture. The texture is moving, so the effect looks dynamic and chaotic.
Scales the distortion effect.
The texture used for distortion offset and detail distortion offsets.
Whether distortion detail should be active.
Scales the UV mapping of detailed distortion.
Scales the intensity of detailed distortion.
Describes the basic parameters of the line that is distorted and generates the glow.
Color of the line.
Color of the glow effect around the line.
The texture used for masking glow.
Animates the glow texture: RG – scale, BA – movement.
Whether the glow texture effect should be active.
Width of line.
Fades the line out with distance from the start point of the lightning.
The material supports two modes of rendering the glow effect around the line: Hard and Soft. The first is based on the smoothstep function, the second on division by distance.
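The two modes can be sketched roughly like this (a simplification of what the material likely does; the exact internal formulas are not documented here):

```cpp
#include <algorithm>
#include <cassert>

static float Saturate(float X) { return std::min(std::max(X, 0.0f), 1.0f); }

// Hard glow: a smoothstep falloff that is crisp and reaches
// exactly zero at Width.
float GlowHard(float Dist, float Width)
{
    float T = Saturate(1.0f - Dist / Width);
    return T * T * (3.0f - 2.0f * T); // smoothstep(0, 1, T)
}

// Soft glow: division by distance gives a long, soft tail
// that never quite reaches zero.
float GlowSoft(float Dist, float Width)
{
    return Width / (Width + Dist);
}
```

Hard gives a sharply bounded halo; Soft bleeds much further from the line.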
The Lightning Fast package contains the mesh editor BP_LightningMesh, which allows you to create unique lightning meshes. Drag and drop the blueprint Blueprints/LightningMesh/BP_LightningMesh onto the scene to start creating a lightning mesh.
Each mesh is generated from a node list (the Nodes array attribute). The node list is an array describing a graph of connections between nodes, using a special structure called BS_LightningNode.
This attribute holds the index of the node's parent in the array. Every node has a parent. If a node's parent index equals its own index, that node is the root (starting point) of the hierarchy generated from the node list.
Location of the end point. It can also be edited visually in the main viewport by selecting the node and using the translation gizmo.
A value greater than or equal to 0 is the index of the loop end point. The default, -1, means there is no loop.
Higher priority means that the node describes the main branch.
The scale of the tangent input-output vector.
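A rough sketch of the node structure described above (the field names only approximate BS_LightningNode as described; the real struct also carries priority, tangent scale, etc.):

```cpp
#include <cassert>
#include <vector>

// Approximation of BS_LightningNode as described above.
struct LightningNode
{
    int   ParentIndex;   // index of the parent in the same array
    float Location[3];   // end point of the segment
    int   LoopIndex;     // >= 0: index of the loop end point, -1: no loop
};

// A node whose parent index equals its own index is the root.
bool IsRoot(const std::vector<LightningNode>& Nodes, int Index)
{
    return Nodes[Index].ParentIndex == Index;
}

// Depth of a node = number of parent hops up to the root.
int Depth(const std::vector<LightningNode>& Nodes, int Index)
{
    int D = 0;
    while (!IsRoot(Nodes, Index))
    {
        Index = Nodes[Index].ParentIndex;
        ++D;
    }
    return D;
}
```

Because each node only stores a parent index, an arbitrary branching tree fits into a flat array.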
The package comes with additional tools that help create unique lightning meshes in the Unreal Engine 4 editor. There are two methods of editing the Nodes array:
Editing by hand using Tree Editor
Editing using Tree Generator
The tree editor is the basic method of editing the node list. There are multiple buttons that speed up building new branches of the lightning mesh.
Appends a child to the selected node (selected node index).
Divides the selected node (selected node index) by inserting a node in the middle.
Removes the selected node (selected node index).
Node Selected Get
Loads data from the selected index into Selected Node Data.
Node Selected Set
Saves data from Selected Node Data to the selected node (selected node index).
Transforms the selected node by the Transform To Apply attribute.
Clears the tree.
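The Append and Divide buttons can be sketched as simple operations on a parent-index array (an illustrative reimplementation, not the blueprint's actual code; only the parent index is modeled here):

```cpp
#include <cassert>
#include <vector>

// Minimal stand-in for an entry of the Nodes array: only the
// parent index matters for these editing operations.
struct Node { int ParentIndex; };

// "Node Append": add a new child of the selected node.
void AppendChild(std::vector<Node>& Nodes, int Selected)
{
    Nodes.push_back({Selected});
}

// "Node Divide": insert a node between the selected node and its
// parent; the selected node is re-parented to the new node.
void DivideNode(std::vector<Node>& Nodes, int Selected)
{
    int NewIndex = static_cast<int>(Nodes.size());
    Nodes.push_back({Nodes[Selected].ParentIndex});
    Nodes[Selected].ParentIndex = NewIndex;
}
```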
Lightning Fast has an advanced system of mesh generators that can be used to automate the creation of a wide range of meshes.
Working with generators is simple:
Add generator to the Generator List used by the BP_LightningMesh editor.
Choose the generator class, e.g. LTG_Branch.
Click Evaluate All Generators.
Double-click the generator instance, edit its parameters, and click “Evaluate All Generators” again.
BP_LightningTreeGenerator is the base lightning generator class that can be extended by other generators by overriding the Evaluate function.
Example generators provided in version 1.0 of the package are described below:
LTG_Branch – Example generator that creates branched lightning effects.
LTG_Spline – Example generator that creates lightning based on spline shape.
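The override pattern can be sketched like this (a plain C++ analogy of the blueprint inheritance, not actual UE4 code; the trivial chain generator is invented for illustration):

```cpp
#include <cassert>
#include <vector>

struct Node { int ParentIndex; };

// Analogy of BP_LightningTreeGenerator: subclasses override Evaluate
// to fill the Nodes array with a generated tree.
class LightningTreeGenerator
{
public:
    virtual ~LightningTreeGenerator() = default;
    virtual void Evaluate(std::vector<Node>& Nodes) = 0;
};

// Analogy of a generator like LTG_Branch: this toy version just emits
// a root and ChainLength nodes chained one after another.
class BranchGenerator : public LightningTreeGenerator
{
public:
    explicit BranchGenerator(int InChainLength) : ChainLength(InChainLength) {}

    void Evaluate(std::vector<Node>& Nodes) override
    {
        Nodes.clear();
        Nodes.push_back({0}); // root points at itself
        for (int i = 1; i <= ChainLength; ++i)
            Nodes.push_back({i - 1});
    }

private:
    int ChainLength;
};
```

The editor can then hold a list of base-class pointers and evaluate every generator the same way, which is what "Evaluate All Generators" does.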
Create static mesh
A generated mesh can easily be converted to a static mesh and used in other blueprints and particle effects. A few steps are needed:
Select the BP_LightningMesh blueprint and find the component called ProceduralMesh.
Find the UseDynamicMaterial option and set it to false.
Click the Create Static Mesh button.
Select where to save the mesh
UseDynamicMaterial = false forces the system to use the default, non-instanced material. If you skip this step, the newly created meshes will contain a reference to an instanced material and won't save.
Questions & Answers
Integration with Fast Stylized Procedural Sky?
Yes! The latest update of Fast Stylized Procedural Sky supports integration with Lightning Fast: just place a lightning bolt on the map and activate it on an event from the built-in lightning system.
Is it worth buying?
Hell yes! After four months of:
researching the topic,
watching thousands of photos and videos with lightning effects,
testing multiple solutions,
removing huge amounts of unsatisfying effects
hard time spent on iterations and optimizations,
I can assure you that it's not worth doing it yourself from scratch when you have a finished solution and my support. I would never start over if I had the choice... 😉