This isn't about a computer game, but it is about a computer simulation. If you can think of a better category for this, please tell me.
So here goes:
In a way, it might be said that fighter aircraft are analogous to predatory animals. They are designed to "kill" other aircraft whilst keeping themselves from being "killed". Could a computer program be used to design new fighter aircraft by utilizing the principle of natural selection?
To elaborate, let's say that some computer program in the future simulates a "population" of 3-D fighter aircraft models. Each fighter aircraft in the simulation has its own artificial intelligence running it, and the goal of each aircraft is to shoot (using either a cannon or a missile) as many other aircraft as it can. Each aircraft also has the goal of not being shot itself.
Now let's get some "genetics" involved. Let's say that the 3-D model of each aircraft is governed by a set of "genes" that determine the proportions and characteristics of each aircraft. For example, there may be a "wingsweep" gene and different "alleles" for this gene may be represented by different wingsweep values (0 degrees, 25 degrees, 40 degrees, etc). Every aspect of each aircraft would have associated "genes", such as fuselage length, number of vertical tailplanes, wing root chord, wing tip chord, etc. Likewise, there would be different "alleles" for each of these genes as well.
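As a very rough sketch of how such a genome might be stored (Python here, purely for illustration; the gene names, allele values and population size are assumptions, not part of the idea itself):

```python
import random

# Illustrative gene pool: each "gene" maps to a list of possible "alleles".
# Gene names and values are made up for the sketch.
GENE_POOL = {
    "wingsweep_deg":     [0.0, 25.0, 40.0, 45.0, 60.0],
    "fuselage_length_m": [12.0, 15.0, 18.0, 21.0],
    "vertical_tails":    [1, 2],
    "wing_root_chord_m": [3.0, 4.0, 5.0],
    "wing_tip_chord_m":  [0.5, 1.0, 1.5],
}

def random_genome():
    """Build one aircraft's genome by picking a random allele for every gene."""
    return {gene: random.choice(alleles) for gene, alleles in GENE_POOL.items()}

# A starting population of 100 simulated aircraft (size chosen arbitrarily).
population = [random_genome() for _ in range(100)]
```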
As far as natural selection goes, the program would work a little differently than in nature. When one aircraft is shot by another aircraft, it isn't destroyed. Instead, the aircraft that was shot is transformed into a genetic duplicate of the aircraft that shot it. Therefore, the aircraft design that is able to shoot the most enemy aircraft will spread its genetic code the most. Given this, an aircraft design that is successful in hitting foes will be selected for. Conversely, an aircraft that is hit will lose its original genetic code because it would have been transformed into an aircraft of a different genetic code. Thus, aircraft designs that are less able to avoid being shot will be selected against.
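A sketch of that replacement step, assuming the population is kept as the list of genome dictionaries from the snippet above:

```python
import copy

def resolve_kill(population, winner_idx, loser_idx):
    """Selection step as described above: the shot-down aircraft is replaced
    in place by a genetic duplicate of the aircraft that shot it, so the more
    successful genome spreads through the population."""
    population[loser_idx] = copy.deepcopy(population[winner_idx])
```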
In order to make this program more useful, it's time to throw in the wild card: genetic mutation. Perhaps every once in a while, when one aircraft shoots an enemy aircraft, the enemy aircraft isn't transformed into an EXACT clone of the winning aircraft. Instead, one of its genes is modified slightly. For example, let's say that the winning aircraft had an "allele" that coded for a wingsweep of 45 degrees, but the computer program caused a "mutation" to that "allele" so that the resulting aircraft's wingsweep is 47 degrees instead of 45 degrees.
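One way the imperfect copying might be sketched; the 5% mutation rate and the few-percent nudge are arbitrary choices for illustration, and the function assumes at least one numeric (float-valued) gene exists:

```python
import random

MUTATION_RATE = 0.05  # illustrative: most copies are exact, a few are not

def mutate(genome, rate=MUTATION_RATE):
    """Imperfect copying: with a small probability, nudge one numeric gene
    by a few percent (e.g. a 45-degree wingsweep becoming roughly 47)."""
    child = dict(genome)
    if random.random() < rate:
        gene = random.choice([g for g, v in child.items() if isinstance(v, float)])
        child[gene] = round(child[gene] * random.uniform(0.95, 1.05), 2)
    return child

# The replacement step then becomes:
#   population[loser_idx] = mutate(population[winner_idx])
```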
If this new aircraft turned out to be more "fit" (successful) than its parent (i.e. it is better able to avoid being shot, or can shoot others more easily), then that design will be selected for in the program over the original design.
Lethal mutations might also occur (for example, an aircraft's wings might be mutated so that they are too small for it to fly). However, these mutations would be selected against and therefore purged from the population. To compensate for the crash of an aircraft, the computer program would generate a new aircraft in its place as a clone of a randomly chosen existing aircraft. That way, the number of aircraft in the simulation would stay constant and we wouldn't have to worry about total extinction.
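A possible way to keep the population size constant after such a crash, again assuming the list-of-genomes representation sketched earlier:

```python
import copy
import random

def respawn_crashed(population, crashed_idx):
    """Replace an unflyable (lethally mutated) aircraft with a clone of a
    randomly chosen surviving design, keeping the population size constant."""
    survivors = [i for i in range(len(population)) if i != crashed_idx]
    population[crashed_idx] = copy.deepcopy(population[random.choice(survivors)])
```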
Given enough time, many different successful aircraft "species" may emerge which exploit different niches. For example, there may be some designs that can fight well at high altitudes, but not at low altitudes. The inverse of that situation may also be true. Likewise, different strategies may emerge. Some aircraft may rely on a highly agile airframe to avoid being shot, whereas some other designs may rely on a stealthy, low radar cross-section to avoid the same fate (if radar is included in the simulation, that is).
I suspect, of course, that simulating such a large number of aircraft using realistic flight physics and artificial intelligence whilst keeping track of each aircraft's "genetic make-up" would require a very powerful computer. Whether or not this could be done with current computers, I don't know. Maybe some time in the future when AI is more fully developed this program might become more plausible.
I think this could be a potentially useful concept, given a computer with enough power.
EDIT: I would like to make it clear that this is NOT a game. It is supposed to be a simulation program run ONLY by the computer itself. It's not the AI that's supposed to evolve, it's the aircraft themselves. The purpose of the program is to evolve increasingly efficient aircraft designs, much as nature evolves increasingly efficient animals to exploit their resources.
I don't know anything about programming myself. This is just a sketch of the basics of how the simulation might run. If you wanted to, perhaps you could even create "digital genes" to describe even the tiniest details of each aircraft, such as the size and number of turbine blades in each engine, the fuel injector configuration, the hydraulic fluid pressure, etc. Getting that detailed would require WAY more computer power than simply evolving an airframe, though.
I'm not exactly sure at this moment how a "sexual selection" element could be incorporated into the program. I could definitely see benefits in it, though. I mean, if an aircraft with a very low stall speed "mates" with an aircraft that is highly agile, the resulting offspring might be better than either of the parent aircraft alone.
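If a sexual-selection element were wanted, a simple uniform crossover over the genome dictionaries sketched earlier might look like this (purely illustrative; genetic algorithms offer many other crossover schemes):

```python
import random

def crossover(parent_a, parent_b):
    """Build an offspring genome by taking each gene at random from one of
    the two parents, so (say) a low-stall-speed design and a highly agile
    design can combine their traits."""
    return {gene: random.choice([parent_a[gene], parent_b[gene]])
            for gene in parent_a}
```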
EVOLVING COMPUTERS!
http://theamazingfr...evolving-computers/ This is SOMEWHAT similar to my concept, but it evolves computer chips instead of aircraft. [Kryptid, Dec 04 2007]
|
|
Surely you'll just get a plane that physically "maps" any quirks or deficiencies in the AI program? |
|
|
I suppose you could market it as a flight simulator program and, in a similar way to the SETI@home project, use millions of users to fly the evolving planes... |
|
|
You'll just end up with the plane that best compensates for the deficiencies in the AI. You could market it as a game, but millions of computer game jockeys won't tell you what a real pilot will do. Also, the only way to compensate for differing levels of ability and for gamers preferring different planes is to force the gamers to switch planes every time. Still better than an AI though, especially if the game has extremely realistic physics and pilot limitations like time in air, G-force troubles, point-of-view limitations, communications troubles, and realistic controls. Also, you can't let the algorithm improve itself by chopping off the cockpit, decreasing flight time or durability, or making maintenance more difficult... |
|
|
Wow... That is an eyeful, but anything to do with AI etc. gets an auto-tick [+]
Oh, and creating an evolving AI to go with this, so you can do away with the pilot altogether, would also be a good idea. Also, you can get actual data on how real pilots fly and incorporate that into the AI. Also, you could even get real pilots to fly these planes too. Also, I like saying also when I'm tired |
|
|
I like it - lots of thought has gone into this - but as someone who's thought about "genetic" algorithms myself, it isn't the potential benefits that need describing (you might as well have an all-encompassing idea "Genetic Algorithm for Designing X") - it's the nuts and bolts of the programming that's missing. |
|
|
For example, how do you encode the 'DNA'? Are we only talking about death selection, or will there be a sexual selection element as well (much more powerful in terms of evolution, though occasionally producing the odd quirk - us included) and if so how would this be modelled? |
|
|
I fear that transmitting your code onto an object you've just destroyed might end up looking like a computerised game of "British Bulldog" - kind of like mass "it". It wouldn't take long before everything starts looking the same. Mutation alone won't be enough to find global maxima - you need to create niches in your environment - there are other things that need to be done other than shooting down hostile planes. Refuelling, cargo, VTOL, long-range sensors, reconnaissance, bombing - it's not all air superiority. There are different mission niches that the nation at arms has to fill. In this respect, I'd suggest that planes are more akin to organs than organisms. |
|
|
Anyway, all of these factors would also need to be introduced to the simulation. I think you'd need to figure out a better way to determine whether something gets to pass its genes on - perhaps through some kind of cost/benefit analysis - kills per $ - after all, someone's going to have to manufacture these things. Which is something else you'll have to factor in; otherwise, you could potentially end up with a zillion-billion dollar mega-plane that nobody can afford to make, but which might still be easily shot down using millions of 100 dollar planes armed with one bullet each! |
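A toy illustration of that kills-per-dollar idea; the numbers are invented solely to show how an unaffordable mega-plane can lose to a swarm of cheap ones under such a metric:

```python
def kills_per_dollar(expected_kills, unit_cost):
    """Hypothetical cost/benefit fitness: expected kills per dollar spent."""
    return expected_kills / unit_cost

# Made-up numbers: the mega-plane scores worse per dollar than the cheap plane.
print(kills_per_dollar(expected_kills=50, unit_cost=2_000_000_000))  # 2.5e-08
print(kills_per_dollar(expected_kills=0.2, unit_cost=100))           # 0.002
```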
|
|
Fascinating concept. Smart thinking tossing in the AI operating the aircraft; the human element would be entirely too random a factor. |
|
|
I imagine you would have separate simulations running for separate mission profiles; it'd be silly to ask your heavy bomber to engage ground support fighters, for instance. |
|
|
Air supremacy in my estimation begins with offensive capability and ends with survivability. The first requires a wide array of missiles, bombs and close-in weaponry - all of which are under constant development and not really in the scope of this idea - and the latter requires stealth and maneuverability. Already fighter aircraft are so dynamically unstable that they cannot be flown without significant computer assistance - this is a very large part of what makes them so maneuverable in combat. You'd come away with an aircraft that was operated entirely by the computer, while a human might - or might not - be aboard for on-the-spot mission control and larger decision making processes. You could just as easily have the operator on the ground, out of harm's way. |
|
|
I've seen this done (on TV) for wing shape, but in hardware in parallel with the software. If I remember correctly, the wings were automatically sculpted by CAM, evaluated for performance in a wind tunnel, and then the most successful was used as the basis for the next generation, with random mutations. I think it worked, as you would expect it to. |
|
|
I foresee three types of problem with using an entirely computer-bound simulation to evolve fighter aircraft. One: you're entirely dependent on the simulation's accuracy; this will be fine up to a point, but not completely. Two: you're dependent on the computer to 'fly' each plane - a real pilot will intuitively know how to best exploit a given aircraft; in your scenario, you are 'evolving' the pilot as well, which can't then be easily translated into reality. Three: long experience (for example, in-vitro selection strategies in molecular biology) shows that it is almost impossible to select for a very complex trait. What happens in practice is that the system evolves a strategy which works according to the rules you've imposed, but doesn't do what you want. For example, you might find that you 'evolve' an aircraft that burrows into the ground with only its gun turrets showing, or one which escapes by pulling 30-G turns that would kill a real pilot. |
|
|
Evolutionary strategies are very effective, but they are limited by the degree of realism of the selection, by real physical constraints, by local but suboptimal 'fitness peaks', and by a host of other factors. |
|
|
Not bad. I particularly like lostdog's idea to get human gamers into the mix. |
|
|
What you describe is known as a genetic algorithm, and it is a commonly-used optimization algorithm. |
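For comparison with the combat-driven scheme above, the textbook generational form of a genetic algorithm looks roughly like this sketch; the fitness, crossover and mutate callables are assumed to be supplied by the simulation:

```python
import random

def genetic_algorithm(init_population, fitness, crossover, mutate, generations=100):
    """Bare-bones generational GA: score the population, keep the fitter half
    as parents, and refill with mutated offspring of random parent pairs."""
    population = list(init_population)
    for _ in range(generations):
        ranked = sorted(population, key=fitness, reverse=True)
        parents = ranked[: len(ranked) // 2]
        children = [mutate(crossover(random.choice(parents), random.choice(parents)))
                    for _ in range(len(ranked) - len(parents))]
        population = parents + children
    return max(population, key=fitness)
```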
|
|
Appreciate that in practice, genetic algorithms are a little finicky if you want them to converge to good solutions, and there's a pretty appreciable amount of human tuning that goes into using them. In real life, other less-sexy optimization algorithms are often a better choice. Simulated Annealing gets many of the benefits of genetic algorithms while being more predictable -- though still a little finicky -- and other, even older deterministic algorithms, like Nelder-Mead and Levenberg-Marquardt, are often the best choice. Also, one of the newest optimization algorithms around is Particle Swarm Optimization; it is supplanting Genetic Algorithms in many places. |
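For anyone curious, a textbook simulated-annealing loop (the alternative mentioned above) can be sketched as follows; the cost and neighbour functions are assumed to be provided, and the geometric cooling schedule is an arbitrary choice:

```python
import math
import random

def simulated_annealing(initial, cost, neighbour, t_start=1.0, t_end=1e-3, steps=10_000):
    """Accept worse neighbours with a probability that shrinks as the
    'temperature' cools, which helps the search escape local optima."""
    current, current_cost = initial, cost(initial)
    best, best_cost = current, current_cost
    for step in range(steps):
        t = t_start * (t_end / t_start) ** (step / steps)
        candidate = neighbour(current)
        candidate_cost = cost(candidate)
        accept = (candidate_cost < current_cost
                  or random.random() < math.exp((current_cost - candidate_cost) / t))
        if accept:
            current, current_cost = candidate, candidate_cost
            if current_cost < best_cost:
                best, best_cost = current, current_cost
    return best
```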
|
|
(If you're curious, Wikipedia has good articles on all of the above.) |
|
|
I want an airplane that launches Chinese stars that spin so fast that it creates turbulence... I think that would be a good weapons system. |
|
|
This is an interesting approach, but seems like a complicated way to solve an easier problem. The dynamics to produce an efficient plane seem to be pretty well-known. Exactly how to program this thing to create a winner given all of the complicated dynamics of flight and fighting would be an immense task. |
|
|
If planes are more akin to organs than organisms, then create teams. Heck, expand on this and develop whole wargames involving planes, armies, ships, tactics, etc. |