How can the solar panel itself radiate heat when it's being heated up while generating and supplying power? Looking at pictures of the ISS, there are radiators that look like they're there specifically to cool the solar panels.
And even if it's viable, why would you not just cool with air down on Earth? Water is used for cooling because it increases effectiveness significantly, but even a closed-loop system with simple dry-air heat exchangers is quite a lot more effective than radiative cooling.
You take the amount of energy absorbed by the solar panels and subtract the amount they radiate. Most things in physics are linear systems that work like this.
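As a rough back-of-the-envelope sketch of that balance (the flux, absorptivity, emissivity, and efficiency values here are illustrative assumptions, not real panel specs):

```python
# Energy balance for a panel in sunlight: net heating is what's
# absorbed minus what's radiated (Stefan-Boltzmann law).
SIGMA = 5.670e-8      # Stefan-Boltzmann constant, W/(m^2 K^4)

solar_flux = 1361.0   # W/m^2, solar constant near Earth
absorptivity = 0.9    # illustrative assumption
emissivity = 0.85     # illustrative assumption
efficiency = 0.2      # fraction converted to electricity (assumed)

# Energy that actually heats the panel, per square meter:
absorbed = absorptivity * solar_flux * (1 - efficiency)

# Equilibrium when absorbed == radiated from both faces:
# absorbed = 2 * emissivity * SIGMA * T^4  ->  solve for T
T_eq = (absorbed / (2 * emissivity * SIGMA)) ** 0.25
print(f"Equilibrium temperature: {T_eq:.0f} K ({T_eq - 273.15:.0f} C)")
```

With these numbers the panel settles around 317 K (~44 C); whether a panel needs extra radiators depends on how far that equilibrium sits from its operating limits.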
> Context editing automatically clears stale tool calls and results from within the context window when approaching token limits.
> The memory tool enables Claude to store and consult information outside the context window through a file-based system.
But it looks like nobody has it as part of the inference loop yet: I guess it's hard to train (i.e. you need a training set which is a good match for how people use context in practice) and it makes inference more complicated. I guess higher-level context management is just easier to implement - and it's one of the things which "GPT wrapper" companies can do, so why bother?
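For illustration, a minimal sketch of what a file-based memory tool could look like on the application side (the function names and layout are my assumptions, not Anthropic's actual API):

```python
# Hypothetical file-based memory tool an agent loop could expose.
# The model calls memory_write/memory_read instead of keeping
# everything inside its context window.
from pathlib import Path

MEMORY_DIR = Path("agent_memory")  # assumed storage location
MEMORY_DIR.mkdir(exist_ok=True)

def memory_write(key: str, content: str) -> str:
    """Persist a note outside the context window."""
    (MEMORY_DIR / f"{key}.md").write_text(content)
    return f"stored {key}"

def memory_read(key: str) -> str:
    """Load a note back into the context only when needed."""
    path = MEMORY_DIR / f"{key}.md"
    return path.read_text() if path.exists() else "(no such note)"

def memory_list() -> list[str]:
    """Let the model see what exists without loading everything."""
    return sorted(p.stem for p in MEMORY_DIR.glob("*.md"))
```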
This standardization, basically, makes a list of docs easier to scan.
As a human, you have permanent memory. LLMs don't: they have to load information into the context, and doing that only as necessary can help.
E.g. if you had anterograde amnesia, you'd want everything to be optimally organized, labeled, etc, right? Perhaps an app which keeps all information handy.
Everybody wants that, though, no? At least some of the time?
For example, if you've just joined a new team or a new project, wouldn't you like to have extensive, well-organised documentation to help get you started?
I think "internet" needs a shared reputation & identity layer - i.e. if somebody offers a comment/review/contribution/etc, it should be easy to check - what else are their contributing, who can vouch for them, etc.
Most of the innovation came from web startups which are just not interested in "shared" anything: they want to be a monopoly, "own" users, etc. So this area has been neglected, and then people got used to the status quo.
PGP / GPG used to have a web of trust, but that sort of just died.
People either need to resurrect WoT, updated for the modern era, or just accept the fact that everything is spammed into smithereens. Blaming AI and social media does not help.
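To make the idea concrete, here's a toy sketch of what such a vouching layer could track (the graph structure and hop limit are purely illustrative assumptions):

```python
# Toy web-of-trust: identities vouch for each other, and a reader
# checks whether an author is reachable from people they already trust.
from collections import defaultdict, deque

vouches: dict[str, set[str]] = defaultdict(set)  # voucher -> vouched-for

def vouch(voucher: str, subject: str) -> None:
    vouches[voucher].add(subject)

def is_trusted(reader_roots: set[str], author: str, max_hops: int = 3) -> bool:
    """Breadth-first search: trust the author if some trusted root
    vouches for them within max_hops (an arbitrary cutoff)."""
    queue = deque((root, 0) for root in reader_roots)
    seen = set(reader_roots)
    while queue:
        person, hops = queue.popleft()
        if person == author:
            return True
        if hops < max_hops:
            for nxt in vouches[person] - seen:
                seen.add(nxt)
                queue.append((nxt, hops + 1))
    return False

vouch("alice", "bob"); vouch("bob", "carol")
print(is_trusted({"alice"}, "carol"))  # True: alice -> bob -> carol
```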
Well, obviously, `npm` has the same destructive power: a package might include a script which steals secrets or wipes a hard drive. But people just assume that, usually, packages don't.
Use of common tools like `ls` and file patching is already baked into the model's weights, so it can use them with minimal effort, leaving more room for actually thinking about the app's code.
If you force it to wrap these actions in non-standard tools, you're basically distracting the model: it has to think about app code and tool code in the same context.
In some cases it does make sense to encourage the model to create utilities for itself - but you can do that without enforcing code-only.
It doesn't matter if it's less efficient; what matters is that it has more chances to verify and get it right. It's hard to roll back a series of tool calls. It's easier to revert state and rerun a complete piece of code until you get the desired result.
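A sketch of that revert-and-rerun pattern (the snapshot mechanism and verify step are assumptions; a real agent might use git instead of directory copies):

```python
# Revert-and-rerun: snapshot the workspace, run the generated script,
# verify the result, and restore the snapshot on failure before retrying.
import shutil, subprocess
from pathlib import Path

def run_until_ok(workspace: Path, script: Path, verify_cmd: list[str],
                 max_attempts: int = 3) -> bool:
    snapshot = workspace.with_name(workspace.name + ".snapshot")
    for attempt in range(max_attempts):
        shutil.copytree(workspace, snapshot)           # save state
        subprocess.run(["python", str(script)], cwd=workspace)
        ok = subprocess.run(verify_cmd, cwd=workspace).returncode == 0
        if ok:
            shutil.rmtree(snapshot)
            return True
        shutil.rmtree(workspace)                       # revert state
        snapshot.rename(workspace)
        # ...here the agent would regenerate the script and try again
    return False
```

The key property is that each attempt starts from a clean state, which a series of incremental tool calls can't easily offer.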
Also true that most tech writers are bad. And companies aren't going to spend >$200k/year on a tech writer until they hit tens of millions in revenue. So AI fills the gap.
As a horror story: our docs team didn't understand that having correct installation links should be one of their top priorities. Obviously, if a potential customer can't install the product, they'll assume it's BS and try to find an alternative. That's so much more important than, e.g., grammar in the middle of some guide.
Have you done a calculation yourself?