Most temperature scales are linear and open-ended. This means that there are lots of numbers to go around - infinitely many, in both directions.

It also means that you can quite easily specify temperatures that cannot possibly exist (e.g. -500°C).

The concept of absolute zero, where there is zero thermal energy in a system, leads to an obvious partial solution: the kelvin scale, where
there are no negative temperatures but the line still leads off into infinity.

Now I believe there must equally be a maximum temperature, to do with quantised energy levels, plancks, beams, joists and secret lap dovetail joints. All we need to do is find the maximum physically possible temperature and set it as 1 on our scale, with absolute zero as 0; then we can specify any possible temperature as a decimal or fraction between 0 and 1.

I assume this will be inconveniently biased towards high temperatures - hence the idea of using a log scale kind of thing.
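To show why the log weighting matters, here's a minimal sketch of both mappings. It assumes the Planck temperature (roughly 1.417 × 10³² K, one commonly cited candidate for a physical maximum) as the "1" point, and an arbitrary tiny floor temperature for the log version - both are assumptions, not settled physics:

```python
import math

# Assumed maximum: the Planck temperature (~1.417e32 K), a commonly
# cited candidate for a physical upper limit on temperature.
T_PLANCK = 1.416784e32  # kelvin

def linear_fraction(t_kelvin):
    """Map 0..T_PLANCK linearly onto 0..1."""
    return t_kelvin / T_PLANCK

def log_fraction(t_kelvin, t_floor=1e-12):
    """Log-scaled mapping: t_floor..T_PLANCK onto 0..1.
    t_floor is an arbitrary choice, since log(0) is undefined."""
    t = max(t_kelvin, t_floor)
    return math.log(t / t_floor) / math.log(T_PLANCK / t_floor)

room = 293.15  # ~20 degrees C
print(linear_fraction(room))  # vanishingly small (~2e-30): everything
                              # familiar is squashed against zero
print(log_fraction(room))     # a far more "sensible" mid-scale number
```

On the linear scale every everyday temperature rounds to 0, which is exactly the inconvenient bias described above; the log version spreads the familiar range out.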

Absolute zero is a theoretical point where there is no molecular motion. It cannot be reached because reaching it would mean pinning down both the position and the momentum of a particle, which the uncertainty principle forbids. Is absolute hot the theoretical point at which the movement speed of individual particles is the speed of light?

[Voice] //absolute hot// good terminology. But we don't say "absolute cold", so perhaps using my scale we should call it "absolute one" instead. Also, I'm not sure it's as simple as that, since photons travel at the speed of light but are not at absolute hot. (I don't think...? Help me out here?)

Temperature is an interesting quantity - strictly speaking, it's a macroscopic aggregate* of a combination of entirely
independent microscopic goings-on. At some point, if the inter-variational** speeds of particles in a substance got up to
the speed of light, that ought to give them a maximal "temperature" in aggregate. Whether those particles could still be
considered particles is doubtful, since they're more likely to disassemble into their constituent bosons, leptons and
fermions - ghostly wave-particle dances which are weird enough for temperature to no longer have a classically meaningful
interpretation.

* If it is an aggregate value, it's likely to be akin to variance: i.e. if a sample of particles were collected, the hotter the
sample, the greater the variance of their movement vectors. No single particle can have a variance, since it's an aggregate
quality; and as an aggregate, it requires there to be an identifiable set of things to include within the set. We might consider
variations on variance, for example the standard deviation, as alternative ways of expressing it. There might be some neat ratio
form where the standard deviation is expressed in terms of the standard deviation of particle vectors of some really, really hot
reference object - say the event horizon of some black hole at the centre of a galaxy perhaps. That would give a likely practical upper limit.
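The footnote's ratio idea can be sketched numerically. This is a toy model, not real thermodynamics: the velocity spreads, sample sizes, and the "hot reference" standard deviation are all invented for illustration:

```python
import math
import random

def speed_std(velocities):
    """Pooled standard deviation of 3D velocity components -
    the footnote's stand-in for 'how hot' a sample is."""
    comps = [c for v in velocities for c in v]
    mean = sum(comps) / len(comps)
    return math.sqrt(sum((c - mean) ** 2 for c in comps) / len(comps))

random.seed(0)
# A coolish sample: velocity components spread ~100 m/s...
cool = [[random.gauss(0, 100) for _ in range(3)] for _ in range(1000)]
# ...versus a made-up very hot reference object, spread ~1e6 m/s.
hot_ref = [[random.gauss(0, 1e6) for _ in range(3)] for _ in range(1000)]

# The "neat ratio form": our sample's spread relative to the reference.
ratio = speed_std(cool) / speed_std(hot_ref)
print(ratio)  # small number; a hotter sample would push this towards 1
```

The hotter the sample relative to the reference, the closer the ratio gets to 1 - which is the same 0-to-1 shape as the proposed scale.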

** Here "inter-variational" is a made-up term to describe the difference between all particles in a substance travelling in
the same direction at some velocity, and the very different case where all the particles in a substance are going in
entirely different directions all at once. Indeed, if everything in a substance is travelling at the speed of light in all directions at
once, that's a very stretched definition of the term "substance" - except perhaps under extremely constraining
gravitational or electromagnetic force. So perhaps temperature might be better defined in terms of the force required to
keep some unit of substance at a particular "temperature" in one macroscopic location for long enough for it to be
considered an aggregable thing, rather than actively dissipating away as fast as possible in all directions at once.

Suppose there's a linear relationship between joules and
temperature increments for a given element or compound
(specific heat capacity). Suppose there's a theoretical
maximum to the amount of energy you can pump into, say,
hydrogen, before it ceases to be hydrogen. (See the link about
firing lasers into plasma). Suppose that, by the time it's ceased
to be hydrogen, it's also ceased to have any comprehensible way
to have its temperature taken (or even defined). Suppose that
hydrogen is the proper reference substance here, because
atoms get more fissile as they get bigger.

Given all those suppositions, there might be a relevant maximum
related to the specific heat capacity of hydrogen.
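The suppositions above can be put into numbers. The sketch below takes hydrogen's specific heat capacity (about 14,300 J/(kg·K) at room conditions; a fact, though it does not actually stay constant at extreme temperatures) and a purely hypothetical ceiling temperature, since the post's "ceases to be hydrogen" point has no agreed value:

```python
# Rough sketch of the "linear joules-per-kelvin" supposition.
C_H2 = 14300.0   # J/(kg*K) for hydrogen gas near room temperature;
                 # in reality this is NOT constant, as the supposition admits
T_MAX = 1.0e12   # kelvin - hypothetical point where hydrogen
                 # "ceases to be hydrogen"; an invented placeholder

def joules_to_reach(t_kelvin, mass_kg=1.0, t_start=0.0):
    """Energy to heat a mass of hydrogen from t_start to t_kelvin,
    under the (false, but supposed) linear model E = C * m * dT."""
    return C_H2 * mass_kg * (t_kelvin - t_start)

# Total energy budget per kilogram under the supposition:
print(joules_to_reach(T_MAX))
```

If all the suppositions held, that energy figure would itself define the top of the scale for a hydrogen reference substance.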

Alternatively, maybe the best starting point would be the Big
Bang - or would that give you a theoretical temperature of infinity
Kelvin (coinciding with a finite amount of energy in a volume of
zero)?

You chose the logginess factor to make room temperature a sensible number. Every temperature can be sensible AND the scale can have a rational start and end point.

// It also means that you can specify temperatures fairly
easily that cannot possibly exist (e.g. -500°C) // You
stated this as a problem, but your "solution" seems to me
to make that even worse.

It seems to me there ought to be a way to define the
temperature scale such that negative infinity corresponds
to what we call absolute zero. That seems like it could be
useful since we've never actually gotten there. So set the
scale so some easily achievable low temperature is -1000
and maybe the current record is -1,000,000, and maybe in a
few years or decades someone will get down to -10^9. I
like the current system of positive infinity relating to
something unattainably hot. If scientists do sometime
discover an absolute maximum temperature we can just
recalibrate the scale again... So for a first pass let me
throw out:

Tk = Temperature in Kelvin

Th = Temperature in Halfbakins.

Th = -273.15/Tk + Tk/273.15

That would give us 0C = 0H, but we'd need a scale factor to
make boiling temp = 100, so:
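As a quick numerical check of the proposed formula (the comment above breaks off before giving its scale factor, so the factor computed here is just one obvious way to finish the thought):

```python
def halfbakins(t_kelvin):
    """Th = -273.15/Tk + Tk/273.15, as proposed above."""
    return -273.15 / t_kelvin + t_kelvin / 273.15

print(halfbakins(273.15))  # 0 degrees C -> exactly 0 H, as claimed
print(halfbakins(1.0))     # near absolute zero -> large negative
print(halfbakins(373.15))  # boiling water, before any scale factor

# One candidate scale factor to make boiling water come out at 100 H
# (a sketch - the original comment never states its own factor):
scale = 100.0 / halfbakins(373.15)
print(scale)
```

The formula does behave as advertised: Th runs to negative infinity as Tk approaches absolute zero, crosses 0 H at 0 °C, and grows without bound as Tk does.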