Aras Pranckevičius (@aras_pr)

Just noticed my color ID texture gets blurry when overriding its max size in the Asset Importer. Setting filter mode to Point doesn't help. Is there any way to get a smaller resolution and keep hard pixel edges? Currently I resize it in Photoshop with Resampling set to Nearest Neighbor :)

Simon Kratz
Yes, today clamping the max texture size in Unity always downsamples the texture with something like a Mitchell filter, and ignores the GPU filtering settings. So you have to do that externally. Maybe worth filing a bug so that someone remembers to fix it one day.
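If you wanted to keep it inside Unity, a rough sketch of doing the nearest-neighbor downsample yourself in script could look like this (untested; PointDownsample is a made-up helper name, and it assumes the source texture is marked readable in the import settings):

using UnityEngine;

// Hypothetical helper: downsample a texture with nearest-neighbor sampling,
// keeping hard pixel edges (no filtering/averaging).
public static class PointDownsample
{
    public static Texture2D Downsample(Texture2D src, int width, int height)
    {
        var dst = new Texture2D(width, height, TextureFormat.RGBA32, false);
        for (int y = 0; y < height; ++y)
        {
            for (int x = 0; x < width; ++x)
            {
                // Pick the single nearest source texel.
                int sx = x * src.width / width;
                int sy = y * src.height / height;
                dst.SetPixel(x, y, src.GetPixel(sx, sy));
            }
        }
        dst.Apply();
        return dst;
    }
}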

Do you think Microsoft is trying to force people into UWP and turn Windows into a closed platform? Do you anticipate Unity will ever have a UWP version that circumvents the Windows Store?

Salim
I don't know wrt Microsoft's plans. Personally, I don't pay much (or any?) attention to UWP. Never used the Windows Store.
Unity can already build apps for UWP of course, as well as regular Win32/64 apps. The Unity editor itself is a Win32/64 application, and I don't see it becoming a UWP app anytime soon. Or reasons to do that.


Hi, Aras! Thank you for your work. I have a question about GLSL. I need to build Ogre with this library, but shared. I recompiled the package with the -fPIC flag, but CMake does not see these libs, even though I made a symbolic link and ran ldconfig. Please tell me how I can build glsl-optimizer as a shared library.

I'll assume you want to integrate glsl-optimizer into Ogre...
However, since you're talking about -fPIC and CMake, I'll also assume this is about a platform that I know nothing about (Linux, by chance?). So, uhh... no idea. I know how to build things on Windows and macOS, but about Linux I have zero knowledge. glsl-optimizer itself is just a bunch of C++ code that needs to be compiled, without any special things done for it. So "whatever Linux people do to build dynamic libraries on Linux" is the best answer I can give :)

Hi Aras! Is there any way to get a splash screen like this one (0:00-0:30)? https://www.youtube.com/watch?v=9_YJnor4XH4 Greets!

You'd have to implement that splash screen as a separate "level" of the game, with all the effects that it does, and make it the first level in the game. Start all the effects, and start loading the "actual level" in the background, using LoadSceneAsync (https://docs.unity3d.com/ScriptReference/SceneManagement.SceneManager.LoadSceneAsync.html).
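A rough sketch of the idea (untested; the scene name and timing field are made up for illustration):

using System.Collections;
using UnityEngine;
using UnityEngine.SceneManagement;

// Lives in the splash "level"; plays the splash effects while the real
// level loads in the background.
public class SplashLoader : MonoBehaviour
{
    public string nextScene = "MainLevel"; // hypothetical scene name
    public float minSplashTime = 3f;       // how long to keep the splash up

    IEnumerator Start()
    {
        AsyncOperation op = SceneManager.LoadSceneAsync(nextScene);
        op.allowSceneActivation = false; // keep showing the splash scene
        yield return new WaitForSeconds(minSplashTime);
        op.allowSceneActivation = true;  // now switch to the loaded level
    }
}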

re: command buffers. Example: I have a point light. I have a second sphere. I only want the objects inside the second sphere to be affected by the point light.

Yes, so it's what I wrote before: the solution depends on whether you're using forward shading or deferred shading.
In forward shading, you'd have to do the sphere check inside the shader part that evaluates the point light, and not add any illumination outside of that sphere's range. This means all the shaders of all objects in the scene need to know about this, and be modified to do it. In deferred shading, the change would be isolated to the deferred lighting pass shader, so that's easier to do.
Or alternatively, *maybe* some sort of multi-camera and/or stencil trickery could achieve what you want, but I have no immediate ideas on how to.

Hey Aras, I was wondering if there's any more information available on Unity CommandBuffers, other than that single blog post and the documentation? I'm trying to use command buffers with individual Lights and not having much luck with the output. I want to "clip" parts of a few lights.

So the per-light command buffers are invoked before or after light shadow map rendering (and as such their functionality is fairly limited, mostly targeted at "I want to render custom stuff into the shadowmap" use cases).
Not sure what you mean by "clip a part of the light" (artificially "bound" its influence?), but I don't think you can easily do that with command buffers. In forward rendering, you'd have to change the code that actually does the lighting directly in the shader; in deferred shading you'd probably want to implement a "custom light type" that draws your own light volume instead of a built-in one.
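For the shadowmap use case that does work, the setup is roughly like this (a minimal sketch; casterMesh/casterMaterial are hypothetical placeholders):

using UnityEngine;
using UnityEngine.Rendering;

// Attaches a command buffer to a light; Unity runs it right after that
// light's shadowmap has been rendered.
[RequireComponent(typeof(Light))]
public class ExtraShadowCasters : MonoBehaviour
{
    public Mesh casterMesh;          // hypothetical extra shadow caster
    public Material casterMaterial;  // material with a shadow caster pass
    CommandBuffer cb;

    void OnEnable()
    {
        cb = new CommandBuffer { name = "Extra shadow casters" };
        cb.DrawMesh(casterMesh, transform.localToWorldMatrix, casterMaterial);
        GetComponent<Light>().AddCommandBuffer(LightEvent.AfterShadowMap, cb);
    }

    void OnDisable()
    {
        GetComponent<Light>().RemoveCommandBuffer(LightEvent.AfterShadowMap, cb);
    }
}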

Hi Aras! I noticed textures set to Sprite in Unity don't seem to have an option to handle non-power-of-2 textures. Is there some hidden way to do so? Would be great for us to get the advantages of PVRTC and ETC compression for our Android/iOS project.

Simon Kratz
I don't know, really. Maybe it's supposed to be there but got removed for some reason? Best to file a bug or ask on the forums; people working with 2D would know.

Hi Aras! Any info about multi-channel signed distance fields in Unity? Is it going to be implemented? It's great for small UI text/icons! Reference: https://twitter.com/Chman/status/794870701501124612

I don't know the context for why @chman was playing around with them, but for general font/text rendering, yes, someone is looking at improving the current situation. Bitmap-based glyphs are not really ideal, and indeed some sort of distance-field-based rendering is one of the options to look into. There are also approaches that directly evaluate the glyph bezier curve outlines in the shader, though I'm not sure what the advantages/disadvantages of that method are.
So yeah, TLDR: someone is looking into this area, but I don't know their roadmap/plans.

Hello Aras! Is Unity going to add volumetric clouds / atmospheric scattering in Unity 2017.x?

It's not in the "must do immediately" plans, but it is on the "items to do" lists somewhere. Whether that will make it into 2017, we'll see. "Improve the atmosphere rendering situation" is something we will try to, well, improve.

Hi Aras! I wonder if you have information on how to use SV_RenderTargetArrayIndex in Unity's rendering pipeline. Basically, I'm looking for a way to call an equivalent of Graphics.SetRenderTarget, but with multiple slices set. Thanks!

Space Toad
Passing -1 as the depthSlice argument of Graphics.SetRenderTarget should bind the whole array or 3D resource. Then you can use SV_RenderTargetArrayIndex.
Note that not all graphics APIs support that, e.g. OpenGL ES does not.
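Roughly like this (a sketch, untested; the sizes and slice count are made up):

using UnityEngine;
using UnityEngine.Rendering;

public static class ArrayTargetExample
{
    // Bind all slices of a 2D texture array as the render target; the
    // shader then picks a slice per-primitive via SV_RenderTargetArrayIndex.
    public static RenderTexture BindWholeArray()
    {
        var rt = new RenderTexture(512, 512, 16, RenderTextureFormat.ARGB32);
        rt.dimension = TextureDimension.Tex2DArray;
        rt.volumeDepth = 4; // number of array slices
        rt.Create();
        // depthSlice = -1 binds the whole array instead of a single slice.
        Graphics.SetRenderTarget(rt, 0, CubemapFace.Unknown, -1);
        return rt;
    }
}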

So... can I consider the appdata_full struct to be 8 interpolators?

(this seems to be a reference to http://ask.fm/aras_pr/answers/141577610604 -- gosh, ask.fm is terrible for discussions :))
The "appdata*" structures in the standard Unity shader includes are for vertex shader inputs. The vertex-to-fragment structures are typically named "v2f*". The interpolators/varyings talk is about the latter.

Hello, Aras. I'm a Chinese developer; maybe my English is poor, please forgive me. Question: the Unity shader official documentation says target 2.0 supports 8 interpolators. What is the meaning of "interpolators"? How do I understand it?

"interpolators" is a DirectX term, for example in OpenGL they are called "varyings" sometimes. It's basically the "things" you pass from the vertex shader into the pixel/fragment shader. In shaders these typically have "TEXCOORDn" semantics on them.
Platforms like OpenGL ES 2.0 are often limited to 8 four-component vectors (so in total up to 32 numbers) that can be written to from the vertex shader and read by the fragment shader. DirectX9 shader model 2 (SM2.0) is slightly more complicated, as it allows up to 8 TEXCOORDn interpolators (each being float, float2, float3 or float4), and additionally two low-precision COLORn interpolators (again each being float..float4).
Later shader models/APIs do away with that split between "texcoord vs color" interpolators, and e.g. DirectX9 shader model 3 says "up to 10 float..float4 interpolators". OpenGL ES 3.0 and DirectX10 says "up to 16" etc.


As an aspiring gfx dev, I always wanted to ask this -stupid?- question to a real one. In most AAA games, colors look heavily saturated (Uncharted, FF15, for instance). Is it an artistic decision? Or is it because something is missing in the lighting step (absorption, GI)? Or is it just my eyes? :) Thx

I think it's mostly artistic direction. Some AAA games go for high saturation. Some go for a brown/gray look, or used to a few years ago; see this half-joke: http://www.codersnotes.com/notes/two-channel/ Some go for orange+teal lighting setups, etc. :)

Hi Aras! We are using a Unity native rendering plugin for our recently released game "Mushroom Wars 2", and it seems there is an issue in Unity 5.x for iOS: the methods UnityPluginLoad and UnityPluginUnload are never called. This causes some issues for us. Is this a known issue, and are there any plans to fix it? THANX

Andrey Korotkov
Sounds like the same question as in http://ask.fm/aras_pr/answers/141365091180 -- TLDR: I know nothing about iOS plugins, but I know they are different from most other platforms "somehow".

Hey Aras, I wrote to you before, but I am not sure if you received my message because my account was not validated. Can you tell me how you created the geometry in the picture of step 4 from your earring project? I am desperately looking for a solution to get this effect on smooth geometry. THANKS

I forgot the exact steps, so I'd go with what I have written in that blog post (http://aras-p.info/blog/2012/12/14/adventures-in-3d-printing/) -- I think I did Filters -> Remeshing -> Surface Reconstruction: Poisson in MeshLab.

Hi Aras. I saw the project you did with the earrings. I am very interested in what kind of values you used for the marching cubes in MeshLab. I want to transform a smooth geometry into a geometry with these kinds of steps. Even rougher. Looking forward to your response. Thank you very much. Best, Alex

I actually forgot the details; this (http://aras-p.info/blog/2012/12/14/adventures-in-3d-printing/) was some years ago and apparently I did not write them down.
I think the marching cubes I did myself, with some simple C++ code (but now I can't find it; apparently I did not save/push it anywhere?). Then I did additional smoothing in MeshLab, since my raw marching cubes output was a very noisy mesh.
Sorry, but I really forgot the details :/

What do you think about Unreal Engine's decision to use C++ as the game logic language? What is your general opinion of managed vs unmanaged languages in game dev context?

Ivan Poliakov
I think every approach has pros & cons.
In UE4's case, they seemingly have a split of "high level logic should be in Blueprints, low level logic in C++". That's a valid approach, though personally it feels to me that the downside is that there's no "middle": you either need to get down to the C++ level (many can't or don't want to), or you need to work with Blueprints (many can't or don't want to).
In Unity's case, it's mostly about this "middle level" (C#/.NET); however, the things we lack are indeed "on the edges" (super high level, visual programming for people who don't want to or can't program; and super low level scripting for people who need to get there). While each can be worked around (via plugins, or visual scripting extensions), it's indeed not ideal right now.
I think managed languages are fine for a lot of game code. They do have some downsides (garbage collection is probably the major one), but on the other hand, game scripting has been using some sort of "higher level language" for a very long time now (e.g. Lua, C#, Python, UnrealScript, other custom languages). In Unity's case in particular, the GC situation is not ideal; I think once we get a more modern GC things should be a bit better.
So yeah, basically different approaches, and each of them has some advantages & disadvantages.


Having worked on the graphics of a multi-platform engine, do you think supporting Windows is difficult enough that a AAA game having horrible performance on day one is not completely unjustified?

I don't think it's so much "supporting Windows" as "supporting a wide range of hardware/software configs".
For big AAA games, usually (not always) most of the revenue comes from consoles. So it's natural that this is where most of the effort goes, and most of the optimizations, and most of the quality assurance.
Now, pretty much everyone also has the game running on PC in some state all the time; after all, all the development tools are on PC, and so on. These days, with PCs & consoles having very similar hardware (no "exotic" hardware like the Cell/360), that's even easier. However, a PC has a ton of things that you don't have to worry about on consoles: varying numbers of CPU cores and speeds (with an unknown amount taken by other applications & background processes), all the different GPUs out there and their various driver versions, unknown amounts & speeds of memory and storage, etc.
Getting most of that working acceptably is usually not rocket surgery, but it requires quite some QA, and then some amount of development work to fix or work around the problems that are uncovered. Game development timelines often do not leave "extra time at the end" for PC optimization: for as long as humanly possible, teams try to make the best game they can on the main platforms (i.e. consoles), and then they ship. Once that is done & shipped, they turn to "oh, we should do some QA & fixes for PC", I think; that's just the natural course of things with consoles being the major money bringers.


Now that Putin will seize the Baltic states, what country would you consider emigrating to? (asking for a friend)

R.Pole
I really hope it won't come to that! But yeah, you never know :(
Places that I'd like to live in: Iceland, New Zealand. Well, I have no idea if I'd really like to live there, but both are kinda remote and have some awesome nature. Sounds like a good deal to me.
