Aras Pranckevičius: Latest answers
Hi Aras, I've seen at GDC that Unity is working on a new rendering pipeline. Then I found a Google Doc about it, and there are things like Volumetric Sky/Fog & Atmospheric Scattering. Does it include some water system, and also a day/night system like the standard ones in CryEngine/other engines? Thank you
I don't actually know :|
When is GI applied in the deferred pipeline? Is it applied with all the lights or after Final Pass? (or somewhere else?)
During the G-buffer pass, ambient & lightmaps & emissive things are rendered into the emission buffer.
This is fairly easy to see using the Frame Debugger by the way.
Hey Aras, in the rendering pipeline, what's the ordering priority of lights? I've noticed it seems to do directional lights last, but point/spot lights seem to be ordered based on when the light was enabled/disabled. Another instance looked like it may have been based on distance from the camera. Is there an order?
I don't think the order is defined in any particular way. If you want sorting of lights in some way, probably would want to sort them yourself.
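If it helps, the kind of manual sorting meant here could look roughly like this. This is a plain-Python sketch of the idea (in Unity you'd do the equivalent over a `Light[]` array in C#); the light tuples and camera position are made-up names for illustration:

```python
import math

def sort_lights_by_distance(lights, camera_pos):
    """Return lights sorted nearest-first relative to the camera.

    `lights` is a list of (name, (x, y, z)) tuples -- a stand-in for
    whatever light representation the engine actually uses.
    """
    def dist(light):
        _, (x, y, z) = light
        cx, cy, cz = camera_pos
        return math.sqrt((x - cx) ** 2 + (y - cy) ** 2 + (z - cz) ** 2)
    return sorted(lights, key=dist)

lights = [("far", (0.0, 0.0, 10.0)), ("near", (0.0, 0.0, 1.0))]
print([name for name, _ in sort_lights_by_distance(lights, (0.0, 0.0, 0.0))])
# ['near', 'far']
```

Any other criterion (intensity, type, priority field) works the same way: just change the sort key.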
Hey Aras, I need to show the Unity _ShadowmapTexture on a texture. It works for an ATI GPU, but not an Nvidia one. Any idea?
Depends on how you sample it, what platform, etc. I'd find it unlikely that it would be GPU-dependent; my guess is that you're perhaps using it in an "undefined" way.
The built-in shadowmap texture is valid only for the duration that "shadows are rendered at", so e.g. sampling it outside of deferred lighting pass, or outside of forward rendering pass that applies lighting (ForwardBase/ForwardAdd) is likely undefined - it can get that texture, or some other texture, or nothing at all.
Also, how you can sample it depends on the platform. E.g. on DX9, you can't sample a raw depth value from a shadowmap at all; it can only do depth comparison.
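To illustrate what "depth comparison" means here: a comparison sampler doesn't return the stored depth value, it returns whether the fragment passes the shadow test. A rough Python sketch of that logic (the 2x2 shadowmap and bias value are made up; real hardware also filters neighbouring taps):

```python
def shadow_compare(shadowmap, u, v, fragment_depth, bias=0.001):
    """Mimic a hardware depth-comparison sample.

    Returns 1.0 if the fragment is lit (closer to the light than the
    stored occluder depth), 0.0 if it is in shadow. `shadowmap` is a
    2D list of depth values in [0, 1]; (u, v) are texel coordinates.
    """
    stored_depth = shadowmap[v][u]
    return 1.0 if fragment_depth - bias <= stored_depth else 0.0

# A made-up 2x2 shadowmap: the left column has a close occluder at depth 0.2.
sm = [[0.2, 1.0],
      [0.2, 1.0]]
print(shadow_compare(sm, 0, 0, 0.5))  # 0.0 -- behind the occluder, shadowed
print(shadow_compare(sm, 1, 0, 0.5))  # 1.0 -- nothing in front, lit
```

So on a platform that only exposes comparison sampling, the 0/1 result is all you can get out of the shadowmap; the raw depth value itself is unavailable.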
Hi Aras, are there any platforms that you are aware of which run Unity and are big endian?
I think WiiU is the only one right now (PS3 & Xbox360 were big endian too, but we already do not support them). No idea whether big endian will ever make a comeback in consumer or gaming CPUs. I guess you never know.
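For reference, byte order is easy to see with the stdlib `struct` module; the same 32-bit value serializes to opposite byte sequences on big vs little endian targets:

```python
import struct
import sys

value = 0x01020304
print(struct.pack(">I", value).hex())  # big endian:    01020304
print(struct.pack("<I", value).hex())  # little endian: 04030201
print(sys.byteorder)  # the host machine's native byte order
```

This is why serialized binary data built on a little endian machine needs byte swapping before a big endian console can read it.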
Hi Aras, what resources can you recommend for graphics programmers? Just wondering where my go-to places should be to find quality material when trying to implement new feature X in my game. realtimerendering.com, dl.acm.org, GDC Vault, ...? - Elvar Orn Unnthorsson
Basically the above, yeah.
"Advances in real-time rendering" siggraph course has some awesome material each year: http://advances.realtimerendering.com/
"Physically Based Shading" course at the same siggraph is also good material: http://blog.selfshadow.com/publications/
Hello Aras! What's your favorite game/s made in Unity? - Jacob Smaga
In the last year, probably INSIDE and Firewatch.
I loved Monument Valley, TIS-100, and Year Walk the year before.
Hi Aras, I am working on a system that uses C++ std::hash (based on murmur?). I am experimenting with different hash functions, and tried CityHash64 and xxHash64, as well as std::hash for integer keys and CityHash64 for strings. So far, CityHash is best for overall CPU time. Any suggestions? Thx! - Jing
Depends on what you want & what your needs are, how long are the typical keys, what platform you're on etc. In some tests that I did (http://aras-p.info/blog/2016/08/09/More-Hash-Function-Tests/), CityHash64 was looking very good indeed. If it works for you, great, use it!
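If you want to compare candidates on your own data, a tiny benchmark harness is enough to get ballpark numbers. The sketch below times Python's hashlib digests over keys of different lengths; CityHash/xxHash are third-party C libraries, so stdlib hashes stand in here, but the methodology (fixed key set, repeat many times, wall-clock the loop) is the same you'd use in C++:

```python
import hashlib
import time

def bench(hash_name, keys, iterations=1000):
    """Return seconds spent hashing every key in `keys`, `iterations` times."""
    start = time.perf_counter()
    for _ in range(iterations):
        for key in keys:
            hashlib.new(hash_name, key).digest()
    return time.perf_counter() - start

# Short vs longer keys -- relative rankings often flip with key length.
keys = [b"x" * n for n in (4, 16, 64, 256)]
for name in ("md5", "sha1", "blake2b"):
    print(name, round(bench(name, keys), 4))
```

The key-length axis matters: a hash that wins on long strings can lose badly on 4-8 byte integer-sized keys, which is why testing with your actual key distribution beats any general benchmark.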
Someone asked about GDC, so my question is: is there going to be any live stream where people can watch Unity's presentations?
The keynote on Feb 28 should be live here https://unity3d.com/gdc2017 -- not sure about other talks. I guess the others might not be live streamed, but recorded.
Hi Aras, Is ScriptableRenderLoop/HDRenderPipeline going to be shown at GDC on "The Future of Rendering in Unity" presentation?
I do think the concept & design of "scriptable render pipelines" is absolutely the focus of that talk overall. But don't know the details, it's not me who's doing the talk.
Hello Aras! Is ScriptableRenderLoop/HDRenderPipeline going to be shipped in Unity 2017.x?
That's the plan, yes.
You can already use an older version of it (at your own risk) in current Unity 5.6 betas; the github project contains information on which revisions of it should be used.
Basic things are already working fairly well; the largest piece of the missing puzzle is nice UX for all the things. For example, how built-in scene view rendering debug modes should interact with debug modes provided by the HD render pipeline itself. Where various settings should be put in the UI. etc. etc.
Hi Aras, since somebody asked about Volumetric Clouds: do you happen to know of any good real-time approaches that are capable of rendering clouds from afar as well as up close? I'm looking for a solution where I have a vehicle on the ground that could jump up to the sky and fly through clouds. - Elvar Orn Unnthorsson
Nothing comes to mind right now, but I have not been following that area.
My guess is that "flying through clouds" and "clouds in the distance" likely need somewhat different systems/approaches for rendering.
"The Real-time Volumetric Cloudscapes of Horizon: Zero Dawn" from http://advances.realtimerendering.com/s2015/index.html I remember being fairly interesting, but I forget whether they handled "flying through clouds" case.
"A Novel Sampling Algorithm for Fast and Stable Real-Time Volume Rendering" from the same siggraph course might be useful for cloud rendering part too.
Hi Aras, I've found this https://docs.google.com/document/d/1e2jkr_-v5iaZRuHdnMrSv978LuJKYZhsIYnrDkNAuvQ/edit and in the "HD Render Loop" section there is info about "Volumetric Lighting" and a "Sky/fog atmospheric scattering model". Does it include Volumetric Clouds? Thank you and greets!
Various volumetric things (sky, clouds, scattering etc.) are on the roadmap, but not very much near "top" of it, and timelines are uncertain.
Just noticed my color ID texture gets blurry when overriding its max size in the Asset Importer. Setting filter mode to Point doesn't help. Is there any way to get a smaller resolution and keep hard pixel edges? Currently I resize it in Photoshop with Resampling set to Nearest Neighbor :) - Simon Kratz
Yes, today clamping the max texture size in Unity always downsamples the texture with something like a Mitchell filter, and ignores the GPU filtering settings. So you have to do that externally. Maybe worth filing a bug so that someone remembers to fix it one day.
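Until that's fixed, the nearest-neighbor resample you're doing in Photoshop is also trivial to script. A pure-Python sketch over a 2D array of color IDs; because each destination texel just picks one source texel, no new in-between colors can appear (unlike with a Mitchell or bilinear filter):

```python
def downsample_nearest(pixels, factor):
    """Downsample a 2D list of values by an integer factor, picking one
    source texel per destination texel -- no filtering, so color IDs
    stay exact instead of being blended together."""
    return [row[::factor] for row in pixels[::factor]]

# A made-up 4x4 image with four color IDs, shrunk to 2x2.
img = [[1, 1, 2, 2],
       [1, 1, 2, 2],
       [3, 3, 4, 4],
       [3, 3, 4, 4]]
print(downsample_nearest(img, 2))  # [[1, 2], [3, 4]]
```

For a real asset pipeline you'd run the equivalent per channel over the image file (e.g. with an image library's nearest-neighbor resize mode), but the principle is exactly this.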
When can we expect good looking/not blurred UI Text in Unity? Greets!
I don't know, but people keep on asking this question :) ...to which I still answer with "I don't know"... https://ask.fm/aras_pr/answers/142086711404
Do you think Microsoft is trying to force people into UWP and turn Windows into a closed platform? Do you anticipate Unity will ever have a UWP version that circumvents the Windows Store? - Salim
I don't know wrt Microsoft's plan. Personally, I don't pay much (or any?) attention to UWP. Never used the Windows Store.
Unity already can build apps for UWP of course, as well as regular Win32/64 apps. The Unity editor itself is a Win32/64 application, and I don't see it becoming a UWP app anytime soon. Or reasons to do that.
What do you think about Otter Browser?
Never heard of it :)
Hello Aras! What's the situation on new Terrain System? Thanks!
It still has not shipped :(
Hi Aras, thank you for your work. I have a question about GLSL. I need to build Ogre with this library, but shared. I recompiled the package with the -fPIC flag, but cmake does not see these libs, even though I made a symbolic link and ran ldconfig. Please tell me how I can build glsl-optimizer as a shared library.
I'll assume you want to integrate glsl-optimizer into Ogre...
However, since you're talking about -fPIC and cmake, I'll also assume this is about a platform that I know nothing about (Linux by chance?). So, uhh... no idea. I know how to build things on Windows and macOS, but about Linux I have zero knowledge. glsl-optimizer itself is just a bunch of C++ code that needs to be compiled, without any special things done for it. So "whatever Linux people do to build dynamic libraries on Linux" is the best answer I can give :)
Hi Aras! Is there any way to get a splash screen like this one (0-0:30)? https://www.youtube.com/watch?v=9_YJnor4XH4 Greets!
You'd have to implement that splash screen as a separate "level" of the game, with all the effects that it does, and make it the first level in the game. Start all the effects, and start loading the "actual level" in the background, using LoadSceneAsync (https://docs.unity3d.com/ScriptReference/SceneManagement.SceneManager.LoadSceneAsync.html).
Hello Aras! I'm working on a game where we are displaying a bunch of small/big texts, the problem is that small text is blurred/ugly. Is anyone from the UI team working on a better Font/Text rendering? Greets!
Sounds pretty much like this other question a few days ago. http://ask.fm/aras_pr/answers/142045215852
re: command buffers. Example: I have a point light. I have a second sphere. I only want the objects inside the second sphere to be affected by the point light.
Yes, so what I wrote before, and the solution depends on whether you're using forward shading or deferred shading.
In forward shading, you'd have to do the sphere check inside the shader part that evaluates the point light, and not add any illumination outside of that sphere's range. This means all the shaders of all objects in the scene need to know about this, and be modified to do this. In deferred shading, the change would be isolated to just the deferred lighting pass shader, so that's easier to do.
Or alternatively, *maybe* some sort of multi-camera and/or stencil trickery could achieve what you want, but I have no immediate ideas on how to.
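The sphere check itself is simple either way. A Python sketch of the per-fragment logic the shader would run; the function name, the plain 1/d² falloff, and all the parameters are made up for illustration (a real shader would also fold in N·L, light color, attenuation curves, etc.):

```python
import math

def bounded_point_light(fragment_pos, light_pos, light_intensity,
                        bound_center, bound_radius):
    """Evaluate a point light, but force its contribution to zero for
    any fragment outside a given bounding sphere."""
    dx = fragment_pos[0] - bound_center[0]
    dy = fragment_pos[1] - bound_center[1]
    dz = fragment_pos[2] - bound_center[2]
    if dx * dx + dy * dy + dz * dz > bound_radius * bound_radius:
        return 0.0  # fragment is outside the bounding sphere: no light
    d = math.dist(fragment_pos, light_pos)
    return light_intensity / max(d * d, 1e-4)  # simple inverse-square falloff

print(bounded_point_light((0, 0, 0), (0, 0, 1), 1.0, (0, 0, 0), 5.0))   # 1.0, lit
print(bounded_point_light((10, 0, 0), (0, 0, 1), 1.0, (0, 0, 0), 5.0))  # 0.0, clipped
```

In forward shading this branch lives in every object's lighting code; in deferred shading it lives once, in the lighting pass, which is exactly why the deferred route is the easier one here.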
Hey Aras, I was wondering if there's any more information available on Unity CommandBuffers, other than that single blog post and the documentation? I'm trying to use the command buffers with individual Lights and not having much luck with the output. I want to "clip" parts of a few lights.
So the per-light command buffers are invoked before or after light shadow map rendering (and as such, their functionality is fairly limited; and mostly targeted at "I want to render custom stuff into the shadowmap" use cases).
Not sure what you mean by "clip a part of the light" (artificially "bound" their influence?), but I don't think you can easily do that with command buffers. In forward rendering, you'd have to change the code that actually does lighting directly in the shader; and in deferred shading you'd probably want to implement a "custom light type" that draws your light volume instead of a built-in light volume.
Hi Aras! I noticed textures set to Sprite in Unity don't seem to have an option to handle non-power-of-2 textures. Is there some hidden way to do so? Would be great for us to get the advantages of PVRTC and ETC compression for our Android/iOS project. - Simon Kratz
I don't know, really. Maybe it's supposed to be there but got removed for some reason? Best probably file a bug or ask on forums, and people working with 2D would know.
Hi Aras! Any info about multi-channel signed distance fields in Unity? Is it going to be implemented? It's great for small UI text/icons! Reference: https://twitter.com/Chman/status/794870701501124612
I don't know the context of why @chman was playing around with them, but for general Font/Text rendering, yes, someone is looking at improving the current situation. Bitmap-based glyphs are not really ideal, and indeed some sort of distance-field-based rendering is one of the options to look into. There are also ways that directly evaluate the glyph bezier curve outlines in the shader, though I'm not sure what the advantages/disadvantages of that method are.
So yeah TLDR: someone is looking into this area, but I don't know their roadmap/plans.