andybak's comments | Hacker News

Mudbun renders using raymarching - the video explains why he has avoided doing this.
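
(For anyone unfamiliar: raymarching - a.k.a. sphere tracing - steps along a ray by the signed-distance-field value at each point instead of intersecting triangles. A rough Python sketch of the core loop, with a sphere SDF standing in for a real scene - not MudBun's actual code, just the general idea:)

    import numpy as np

    def sdf_sphere(p, radius=1.0):
        # Signed distance from point p to a sphere at the origin
        return np.linalg.norm(p) - radius

    def raymarch(origin, direction, max_steps=128, eps=1e-4, max_dist=100.0):
        # Sphere tracing: advance by the SDF value, which is the largest
        # step guaranteed not to overshoot the nearest surface
        t = 0.0
        for _ in range(max_steps):
            d = sdf_sphere(origin + t * direction)
            if d < eps:
                return t   # hit: distance along the ray
            t += d
            if t > max_dist:
                break
        return None        # miss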

Although a decent chunk of modern tooling is there to handle the limitations of triangles. And modelling often uses higher-level abstractions that are only turned into triangles at the end of the process.

That's true if you're using a CAD-like tool, but that's typically not used for art (more for engineering / mechanical design).

Game / VFX artists heavily use mesh-based tools such as Maya or Blender.


By coincidence this was released a few weeks ago: https://hothardware.com/news/max-payne-rtx-remix-mod

> Much of the science about lighting, physics, and rendering we take for granted today was mostly unknown;

I'm not so sure. I grew up playing with offline 3D rendering rather than real-time game stuff - and game dev was merely reusing the same smoke and mirrors that people had used a decade earlier to keep offline rendering times under a week. People always knew the "correct" way to do things; it was just out of reach given the hardware constraints. GI, radiosity, path tracing etc. already existed well before this - but nobody could do it on consumer hardware.


That scene in Independence Day seems less far-fetched with every passing moment.

The Jeff Goldblum virus one?

I believe fans have provided a retroactive explanation that all our computer tech was based on reverse-engineering the crashed alien ship, and thus the architecture, ABIs, etc. were compatible.

It's a movie, so whatever, but considering how easily a single project / vendor / chip / anything breaks compatibility, it's a laughable explanation.

Edit: phrasing


That isn't actually a fan theory; it was an actual plot point that was cut from the film for time.

Still dumb but not as dumb as what we got.


Reminds me of how in the original Matrix plot the humans were being used for compute power, but the studio execs decided audiences wouldn't understand it.


Gaussian splats


I asked it to do a task that doesn't require spreadsheets, but it keeps asking for access to my Google Drive.


It uses Google Sheets as a "memory layer" for complex workflows - for example, to orchestrate multi-tab sub-agents, where an independent sub-agent tab is launched per row to execute and write back new columns.

We only request the drive.file permission, so we can create new sheets or access ones explicitly granted to us via the Google Drive Picker.
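
(For context: drive.file is the narrow, per-file scope - it only covers files the app created itself or ones the user explicitly picked. A rough sketch of requesting it with Google's Python client library, assuming google-auth-oauthlib is installed; "client_secret.json" is just a placeholder path:)

    from google_auth_oauthlib.flow import InstalledAppFlow

    # drive.file: per-file access, limited to files the app created or
    # that the user explicitly shared with it (e.g. via the Drive Picker)
    SCOPES = ["https://www.googleapis.com/auth/drive.file"]

    flow = InstalledAppFlow.from_client_secrets_file("client_secret.json", SCOPES)
    creds = flow.run_local_server(port=0)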


That needs to be explained at the point the permission is requested.


Skills, plugins, apps, connectors, MCPs, agents - anyone else getting a bit lost?


In my opinion it's to some degree an artifact of immature and/or rapidly changing technology. Basically, not many people know what the best approach is, all the use cases aren't well understood yet, and things are changing so rapidly that they're basically just creating interfaces around everything so you can change how things flow in and out of LLMs any way you may desire.

Some paths are emerging as popular, but in a lot of cases we're still not sure even these are the long-term paths that will remain. It doesn't help that there's no good taxonomy (that I'm aware of) to define and organize the different approaches out there. "Agent", for example, is a highly overloaded term that means a lot of things - even within this space, agents mean different things to different groups.


I liken the discovery/invention of LLMs to the discovery/invention of the electric motor - it's easy to take things like cars, drills, fans, pumps etc. for granted now, and all of the ergonomics and standards around them seem obvious in this era, but it took quite a while to go from "we can put power in this thing and it spins" to the state we're in today.

For LLMs, we're just about at the stage where we've realized we can jam a sharp thing in the spinny part and use it to cut things. The race is on not only to improve the motors (models) themselves, but to invent ways of holding and manipulating and taking advantage of this fundamental thing that feel so natural that they seem obvious in hindsight.


All marketing names for APIs and prompts. IMO you don't even need to try to follow, because there's nothing inherently new or innovative about any of this.


None of them matter that much. They're all just ways to bring in context. Think of them as conveniences.

Tools are useful so the AI can execute commands, but beyond that it's just ways to help you build the context for your prompt. Either pulling in premade prompts that provide certain instructions or documentation, or providing more specialized tools for the model to use, along with instructions on using those tools.
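
To make that concrete: a "tool" is typically just a JSON schema plus a description the model reads. A sketch of a hypothetical run_command tool in the OpenAI function-calling format (the name and parameters here are invented for illustration):

    # Hypothetical tool definition; the model returns a call like
    # {"name": "run_command", "arguments": "{\"command\": \"ls\"}"}
    # and your code decides whether and how to actually execute it.
    run_command_tool = {
        "type": "function",
        "function": {
            "name": "run_command",
            "description": "Execute a shell command and return its output.",
            "parameters": {
                "type": "object",
                "properties": {
                    "command": {
                        "type": "string",
                        "description": "The shell command to run.",
                    }
                },
                "required": ["command"],
            },
        },
    }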


They’re all bandaids


Just like C++, JavaScript and every Microsoft product in existence


It reminds me of LLM output at scale. LLMs tend to produce a lot of similar but slightly different ideas in a codebase when not properly guided.


It's like JS frameworks. Just wait until a React emerges and get up to speed with that later.


That's funny. My reaction to React emerging was to run away from JS frameworks entirely.


React itself took a few years to decide how it should work (hooks not classes, etc.).


Probably the same will follow with LLMs. If you find something that works for you - sorry, but that will change.


I'm one of the maintainers of Open Brush (the open-source continuation of Google's Tilt Brush) and a huge chunk of our community is in Japan as well as other East Asian countries. The language barrier is really frustrating, as I'd love to engage with them more (respond to bug reports, feature requests, etc.).

(Open Brush can be used to create content for platforms such as VRChat, as well as being a way to create explorable spaces and artworks in its own right.)

