Aras Pranckevičius
Latest answers

Hi Aras, what resources can you recommend for graphics programmers? Just wondering where my go-to places should be to find quality material when trying to implement new feature X in my game. realtimerendering.com, dl.acm.org, GDC Vault, ...?

Elvar Orn Unnthorsson

Basically the above, yeah.

"Advances in real-time rendering" siggraph course has some awesome material each year: http://advances.realtimerendering.com/

"Pysically based shading" course at the same siggraph is also good material: http://blog.selfshadow.com/publications/

Books like http://www.realtimerendering.com/book.html and http://www.pbrt.org/ are very useful too

Hello Aras! What's your favorite game/s made in Unity?

Jacob Smaga

In the last year, probably INSIDE and Firewatch.

In the year before that, I loved Monument Valley, TIS-100 and Year Walk.

Hi Aras, I am working on a system that uses C++ std::hash (based on Murmur?). I am experimenting with different hash functions and have tried CityHash64 and xxHash64, as well as std::hash for integer keys and CityHash64 for strings. So far CityHash is best for overall CPU use. Any suggestions? Thx! Jing

Depends on what you want and what your needs are: how long the typical keys are, what platform you're on, etc. In some tests that I did (http://aras-p.info/blog/2016/08/09/More-Hash-Function-Tests/), CityHash64 was looking very good indeed. If it works for you, great, use it!

Someone asked about GDC, so my question is: is there going to be any live stream where people can watch Unity's presentations?

The keynote on Feb 28 should be live here https://unity3d.com/gdc2017 -- not sure about other talks. I guess the others might not be live streamed, but recorded.

Hi Aras, Is ScriptableRenderLoop/HDRenderPipeline going to be shown at GDC on "The Future of Rendering in Unity" presentation?

I do think the concept & design of "scriptable render pipelines" is very much the focus of that talk overall. But I don't know the details; it's not me who's giving the talk.

Hello Aras! Is ScriptableRenderLoop/HDRenderPipeline going to be shipped in Unity 2017.x?

That's the plan, yes.

You can already use an older version of it (at your own risk) in current Unity 5.6 betas; the github project contains information on which revisions of it should be used.

Basic things are already working fairly well; the largest missing piece of the puzzle is nice UX for all the things. For example, how built-in scene view rendering debug modes should interact with debug modes provided by the HD render pipeline itself, where various settings should go in the UI, and so on.

Hi Aras, since somebody asked about Volumetric Clouds: do you happen to know of any good real-time approaches that are capable of rendering clouds from afar as well as up close? I'm looking for a solution where I have a vehicle on the ground that could jump up to the sky and fly through clouds.

Elvar Orn Unnthorsson

Nothing comes to mind right now, but I have not been following that area.

My guess is that "flying through clouds" and "clouds in the distance" likely need somewhat different systems/approaches for rendering.

"The Real-time Volumetric Cloudscapes of Horizon: Zero Dawn" from http://advances.realtimerendering.com/s2015/index.html I remember being fairly interesting, but I forget whether they handled "flying through clouds" case.

"A Novel Sampling Algorithm for Fast and Stable Real-Time Volume Rendering" from the same siggraph course might be useful for cloud rendering part too.

Hi Aras, I've found this https://docs.google.com/document/d/1e2jkr_-v5iaZRuHdnMrSv978LuJKYZhsIYnrDkNAuvQ/edit and in the "HD Render Loop" section there is info about "Volumetric Lighting" and a "Sky/fog atmospheric scattering model". Does it include Volumetric Clouds? Thank you and greets!

Various volumetric things (sky, clouds, scattering etc.) are on the roadmap, but not very near the "top" of it, and timelines are uncertain.

Just noticed my color ID texture gets blurry when overriding its max size in the Asset Importer. Setting filter mode to Point doesn't help. Is there any way to get a smaller resolution and keep hard pixel edges? Currently I resize it in Photoshop with Resampling set to Nearest Neighbor :)

Simon Kratz

Yes, today clamping the max texture size in Unity always downsamples the texture with something like a Mitchell filter, and ignores the GPU filtering settings. So you have to do that externally. Maybe worth filing a bug so that someone remembers to fix it one day.
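
If you want to automate that external step, a rough editor-side sketch of a nearest-neighbor downsample could look like this (untested; assumes the source texture is uncompressed RGBA32 with Read/Write enabled, and the ColorIdTextureUtil/DownsamplePoint names are made up):

using UnityEngine;

// Hypothetical helper: nearest-neighbor downsample that keeps hard color ID edges.
public static class ColorIdTextureUtil
{
    public static Texture2D DownsamplePoint(Texture2D src, int newWidth, int newHeight)
    {
        var srcPixels = src.GetPixels32();
        var dstPixels = new Color32[newWidth * newHeight];
        for (int y = 0; y < newHeight; ++y)
        {
            int sy = y * src.height / newHeight; // nearest source row
            for (int x = 0; x < newWidth; ++x)
            {
                int sx = x * src.width / newWidth; // nearest source column
                dstPixels[y * newWidth + x] = srcPixels[sy * src.width + sx];
            }
        }
        var dst = new Texture2D(newWidth, newHeight, TextureFormat.RGBA32, false);
        dst.filterMode = FilterMode.Point;
        dst.SetPixels32(dstPixels);
        dst.Apply();
        return dst;
    }
}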

When can we expect good looking/not blurred UI Text in Unity? Greets!

I don't know, but people keep on asking this question :) ...to which I still answer with "I don't know"... https://ask.fm/aras_pr/answers/142086711404

Do you think Microsoft is trying to force people into UWP and turn Windows into a closed platform? Do you anticipate Unity will ever have a UWP version that circumvents the Windows Store?

Salim

I don't know wrt Microsoft's plan. Personally, I don't pay much (or any?) attention to UWP. Never used the Windows Store.

Unity already can build apps for UWP of course, as well as regular Win32/64 apps. The Unity editor itself is a Win32/64 application, and I don't see it becoming a UWP app anytime soon, or reasons to do that.

What do you think about Otter Browser?

Never heard of it :)

Hello Aras! What's the situation on new Terrain System? Thanks!

It still has not shipped :(

Hi Aras, thank you for your work. I have a question about GLSL. I need to build Ogre with this library, but shared. I recompiled the package with the -fPIC flag, but CMake does not see these libs, even though I made a symbolic link and ran ldconfig. Please tell me how I can build glsl-optimizer as a shared library.

I'll assume you want to integrate glsl-optimizer into Ogre...

However, since you're talking about -fPIC and CMake, I'll also assume this is about a platform that I know nothing about (Linux, by chance?). So, uhh... no idea. I know how to build things on Windows and macOS, but about Linux I have zero knowledge. glsl-optimizer itself is just a bunch of C++ code that needs to be compiled, without any special things done for it. So "whatever Linux people do to build dynamic libraries on Linux" is the best answer I can give :)

Hi Aras! Is there any way to get a splash screen like this one? 0-0:30 https://www.youtube.com/watch?v=9_YJnor4XH4 . Greets!

You'd have to implement that splash screen as a separate "level" of the game, with all the effects that it does, and make it the first level in the game. Start all the effects, and start loading the "actual level" in the background, using LoadSceneAsync (https://docs.unity3d.com/ScriptReference/SceneManagement.SceneManager.LoadSceneAsync.html).
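
Something like this rough sketch is what I mean (the "MainLevel" scene name and the script name are made up; the script would live in the splash scene):

using UnityEngine;
using UnityEngine.SceneManagement;

public class SplashLoader : MonoBehaviour
{
    public string nextScene = "MainLevel"; // hypothetical name of the real first level
    public float minSplashTime = 3.0f;     // keep the splash up at least this long, in seconds

    System.Collections.IEnumerator Start()
    {
        // Kick off loading of the actual level while the splash effects play.
        AsyncOperation load = SceneManager.LoadSceneAsync(nextScene);
        load.allowSceneActivation = false; // don't switch scenes yet

        float t = 0f;
        while (t < minSplashTime || load.progress < 0.9f) // progress stalls at 0.9 until activation
        {
            t += Time.deltaTime;
            yield return null;
        }
        load.allowSceneActivation = true; // splash done, switch to the loaded level
    }
}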

Hello Aras! I'm working on a game where we display a bunch of small/big text; the problem is that the small text is blurred/ugly. Is anyone from the UI team working on better Font/Text rendering? Greets!

Sounds pretty much like this other question a few days ago. http://ask.fm/aras_pr/answers/142045215852

re: command buffers. Example: I have a point light. I have a second sphere. I only want the objects inside the second sphere to be affected by the point light.

Yes, so it's what I wrote before: the solution depends on whether you're using forward shading or deferred shading.

In forward shading, you'd have to do the sphere check inside the shader part that evaluates the point light, and not add any illumination outside of that sphere's range. This means all the shaders of all objects in the scene need to know about this, and be modified to do this. In deferred shading, the change would be isolated to the deferred lighting pass shader, so that's easier to do.

Or alternatively, *maybe* some sort of multi-camera and/or stencil trickery could achieve what you want, but I have no immediate ideas on how to do that.
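
For the forward route, the script side is just feeding the sphere to the shaders; the actual check (killing the point light contribution outside the sphere) has to live in the shader code itself. A rough sketch, with made-up property names:

using UnityEngine;

public class LightBoundsSphere : MonoBehaviour
{
    public Transform boundsSphere; // the "second sphere"
    public float radius = 5.0f;

    void Update()
    {
        // Shaders would read _LightBoundsSphere.xyz as the center and .w as the radius,
        // and zero out the point light contribution for pixels outside that sphere.
        Vector3 p = boundsSphere.position;
        Shader.SetGlobalVector("_LightBoundsSphere", new Vector4(p.x, p.y, p.z, radius));
    }
}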

Hey Aras, I was wondering if there's any more information available on Unity CommandBuffers, other than that single blog post and the documentation? I'm trying to use the command buffers with individual Lights and not having much luck with the output. I want to "clip" parts of a few lights.

So the per-light command buffers are invoked before or after the light's shadow map rendering (and as such, their functionality is fairly limited, and mostly targeted at "I want to render custom stuff into the shadowmap" use cases).

Not sure what you mean by "clip a part of the light" (artificially "bound" their influence?), but I don't think you can easily do that with command buffers. In forward rendering, you'd have to change the code that actually does lighting directly in the shader; and in deferred shading you'd probably want to implement a "custom light type" that draws your light volume instead of a built-in light volume.
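
For completeness, attaching a command buffer to a light looks roughly like this (a sketch; the mesh/material fields and the class name are made up, and the buffer runs right after that light's shadowmap is rendered):

using UnityEngine;
using UnityEngine.Rendering;

[RequireComponent(typeof(Light))]
public class LightShadowCommandBuffer : MonoBehaviour
{
    public Mesh customCasterMesh;         // hypothetical extra shadow caster
    public Material customCasterMaterial;

    void OnEnable()
    {
        var cb = new CommandBuffer { name = "Custom shadow casters" };
        cb.DrawMesh(customCasterMesh, Matrix4x4.identity, customCasterMaterial);
        GetComponent<Light>().AddCommandBuffer(LightEvent.AfterShadowMap, cb);
    }

    void OnDisable()
    {
        GetComponent<Light>().RemoveCommandBuffers(LightEvent.AfterShadowMap);
    }
}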

Hi Aras! I noticed textures set to Sprite in Unity don't seem to have an option to handle non-power-of-2 textures. Is there some hidden way to do so? It would be great for us to get the advantages of PVRTC and ETC compression for our Android/iOS project.

Simon Kratz

I don't know, really. Maybe it's supposed to be there but got removed for some reason? It's probably best to file a bug or ask on the forums; people working with 2D would know.

Hi Aras! Any info about multi-channel signed distance fields in Unity? Is it going to be implemented? It's great for small UI text/icons! Reference: https://twitter.com/Chman/status/794870701501124612

I don't know the context why @chman was playing around with them, but for general Font/Text rendering, yes, someone is looking at improving the current situation. Bitmap-based glyphs are not really ideal, and indeed some sort of distance-field-based rendering is one of the options to look into. There are also methods that directly evaluate the glyph bezier curve outlines in the shader, though I'm not sure what the advantages/disadvantages of that approach are.

So yeah TLDR: someone is looking into this area, but I don't know their roadmap/plans.

Hello Aras! Is Unity going to add Volumetric Clouds / Atmospheric Scattering in Unity 2017.x?

It's not in the "must do immediately" plans, but it is on the "items to do" lists somewhere. Whether that will make it into 2017, we'll see. "Improve the atmosphere rendering situation" is something we will try to, well, improve.

Hi Aras! I wonder if you have information on how to use SV_RenderTargetArrayIndex in the Unity rendering pipeline. Basically, I'm looking for a way to call an equivalent of Graphics.SetRenderTarget, but with multiple slices set. Thanks!

Space Toad

Passing -1 for the depthSlice argument of Graphics.SetRenderTarget should bind the whole array or 3D resource. Then you can use SV_RenderTargetArrayIndex.

Note that not all graphics APIs support that, e.g. OpenGL ES does not.
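
On the C# side it would look roughly like this (a sketch; the material/shader that actually writes SV_RenderTargetArrayIndex is up to you and not shown here):

using UnityEngine;
using UnityEngine.Rendering;

public class ArrayTargetExample : MonoBehaviour
{
    public Material sliceMaterial; // hypothetical shader that outputs SV_RenderTargetArrayIndex
    RenderTexture rtArray;

    void Start()
    {
        rtArray = new RenderTexture(256, 256, 16)
        {
            dimension = TextureDimension.Tex2DArray,
            volumeDepth = 4 // number of slices
        };
        rtArray.Create();

        // depthSlice = -1 binds all slices of the array at once.
        Graphics.SetRenderTarget(rtArray, 0, CubemapFace.Unknown, -1);
        GL.Clear(true, true, Color.black);
        // ...now draw geometry with sliceMaterial, whose shader picks the slice
        // per primitive via SV_RenderTargetArrayIndex.
    }
}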

So... can I consider the appdata_full struct to be 8 interpolators?

(this seems to be a reference to http://ask.fm/aras_pr/answers/141577610604 -- gosh, ask.fm is terrible for discussions :))

"appdata*" structures in standard Unity shader includes are for vertex shader inputs. The vertex-to-fragment structures are typically named "v2f*". The interpolators/varyings talk is about the latter.

Hello Aras, I'm a Chinese developer; my English may be poor, please forgive me. Question: in the official Unity shader documentation, target 2.0 supports 8 interpolators. What does "interpolators" mean? How should I understand it?

"interpolators" is a DirectX term, for example in OpenGL they are called "varyings" sometimes. It's basically the "things" you pass from the vertex shader into the pixel/fragment shader. In shaders these typically have "TEXCOORDn" semantics on them.

Platforms like OpenGL ES 2.0 are often limited to 8 four-component vectors (so in total up to 32 numbers) that can be written to from the vertex shader and read by the fragment shader. DirectX9 shader model 2 (SM2.0) is slightly more complicated, as it allows up to 8 TEXCOORDn interpolators (each being float, float2, float3 or float4), and additionally two low-precision COLORn interpolators (again each being float..float4).

Later shader models/APIs do away with that split between "texcoord" and "color" interpolators; e.g. DirectX9 shader model 3 says "up to 10 float..float4 interpolators", and OpenGL ES 3.0 and DirectX10 say "up to 16", etc.

As an aspiring gfx dev, I always wanted to ask this -stupid?- question to a real one. In most AAA games, colors look heavily saturated (Uncharted, FF15, for instance). Is it an artistic decision? Or is it because something is missing in the lighting step (absorption, GI)? Or is it just my eyes? :) Thx

I think it's mostly artistic direction. Some AAA games go for high saturation. Some go for a brown/gray look, or used to a few years ago -- see this half-joke: http://www.codersnotes.com/notes/two-channel/ Some go for orange+teal lighting setups, etc. :)

About Aras Pranckevičius:

Graphics programmer and code plumber at Unity

Kaunas, Lithuania

http://aras-p.info/

#graphics #programming