
Nostradam-R-us.com

Perfectly predicted predictions

As someone once said, predictions are difficult, especially about the future.

But.

Buy a domain name. On said domain, create, say, 100,000 pages, each with a non-obvious but plausible name (say, Nostradam-R-us/wegotyour_predictionsrighthere; /brilliantpredictions; /checkoutthesepredictions). On each page, post a list of predictions for the next five major earthquakes (time ±2 weeks, and approximate location).

Once the next five major quakes have happened, contact the media and give them the URL of the page that carried the correct predictions, reminding them that they can check the page on the Wayback Machine if they want to confirm that the accurate predictions were made long before the events.

Fame and glory follow.

MaxwellBuchanan, May 21 2012

http://xkcd.com/882/ [hippo, May 22 2012]

       How about www.millionmonkeys.org?
RayfordSteele, May 22 2012
  

       This is similar to the scam in which a stockmarket 'expert' sends out 2^n letters to potential investors, half stating that shares in a particular company (or the stockmarket as a whole) will go up, and half that they will go down.
Then they repeat the process with the 2^(n-1) recipients who received the correct prediction, halving the list each round, until a handful of investors have each witnessed a perfect track record and invest on the strength of it.
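The arithmetic of the scam can be sketched in a few lines of Python (the starting figure of 2^20 letters and the count of ten rounds are illustrative assumptions, not from the annotation above):

```python
# Each round, tell half the remaining recipients "up" and half "down";
# only the half who happened to receive the correct call stay on the list.
def perfect_prediction_survivors(letters, rounds):
    survivors = letters
    for _ in range(rounds):
        survivors //= 2  # the other half saw a wrong prediction and are dropped
    return survivors

# Start with 2^20 (about a million) letters; after 10 rounds,
# 2^10 = 1024 people have each seen ten consecutive correct predictions.
print(perfect_prediction_survivors(2**20, 10))  # -> 1024
```

No prediction skill is required at any point; the scheme simply spends exponentially many letters to buy one small audience's confidence.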
  

       As I see it, this idea as stated has two issues:   

       1) 100,000 pages won't be enough for five major earthquakes. Suppose you wish to 'predict' *one* quake, happening some time in the next 10 years. Since a ±2-week window covers 4 weeks, you need 52/4*10 = 130 different time versions. You also need a location; I suppose you might get away with 200 locations, roughly one for every country (since that seems to be how earthquakes are generally reported) - although some could be merged due to size, some would need to be split. I'm excluding undersea earthquakes for simplicity.
This means you need 130*200 = 26,000 different pages just to get the first quake right, assuming it happens within the time-frame.
To get a run of five quakes, one would need 26,000^5 = 11,881,376,000,000,000,000,000 pages. Probably not going to get away with that one.
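As a sanity check on the figures above (the 130 time slots and 200 locations are [Loris]'s own rough assumptions):

```python
# 4-week prediction windows (±2 weeks) over a 10-year span
time_slots = 52 // 4 * 10   # 130 distinct time versions
locations = 200             # rough "one per country" assumption
pages_per_quake = time_slots * locations
print(pages_per_quake)      # -> 26000

# A run of five independent quake predictions multiplies the page count:
print(pages_per_quake ** 5)  # -> 11881376000000000000000
```

About 1.2 x 10^22 pages, comfortably beyond any hosting plan.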
  

       2) For your predictions to be verified, your pages must be indexed by the wayback machine. Disregarding (1), this means that any correct hit will be swamped by the archived failures, which will thus reveal your scheme.   

       However, all is not lost.
Regarding (1), the simple solution is to only reveal one earthquake at a time, as in the aforementioned scam, thus avoiding the combinatorial explosion.
Regarding (2), the time-honoured tradition seems to be to make vague predictions which can subsequently be shoehorned into whatever circumstances later transpire.
Loris, May 22 2012
  

       I think this calls for a new and beautifully produced and edited scientific publication, "The International Journal of Failed Predictions". This important publication would summarise every failed prediction, so that successful predictions could be set in context and unscrupulous scientific researchers wouldn't be able to pull the trick (linked) of running 20 experiments at a p<0.05 significance level and reporting only the one result which happens to come out positive.
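The odds behind that linked trick are easy to check - a quick sketch, assuming 20 independent tests of a true null hypothesis at the conventional 0.05 threshold:

```python
# Chance that at least one of 20 independent tests of a true null
# hypothesis comes out "significant" purely by luck at p < 0.05:
p_at_least_one_false_positive = 1 - 0.95 ** 20
print(round(p_at_least_one_false_positive, 2))  # -> 0.64
```

So roughly two times in three, the researcher gets a publishable "discovery" from pure noise.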

But, I hear you clamour, how will this journal be sure to collect all the failed predictions? Well, I'm glad you asked - before any prediction is made, the researcher would have to lodge it with SISIP (Société Internationale pour l'Intégrité Prédictive). If the prediction turns out to be unsuccessful, The International Journal of Failed Predictions will pick it up; if it is successful, it can be published as normal - but no reputable journal will publish a prediction which hasn't been a priori lodged with SISIP.

To provide consistency with this idea, The International Journal of Failed Predictions's website will be "Nostradam-Arse!"
hippo, May 22 2012
  

       //any correct hit will be swamped by the archived failures//   

       Ah, but that was why I suggested giving each page a plausible but non-obvious name, with no links from the root page. Thusfore, I can point someone towards the randomly successful page "Nostradam-R-Us/earthquake_predictions", but they will have a hard time finding "Nostradam-R-Us/predictorama".
MaxwellBuchanan, May 22 2012
  

       Ah, I'd forgotten that you can persuade the Wayback Machine to cache a page by giving it the URL (apparently this only works once), meaning that you don't need an external link to each page to get it indexed.
I'm still not certain that it's impossible to get a list of the URLs available under a given domain. You might be able to remove failures from the archive using the site's robots.txt policy (perhaps - you might need each file in its own directory), although presumably you'd then have to wait for the site to be crawled again after the last quake (which could be 6 months, and would look suspicious) - and that would also take us back to needing external links to every file.
Loris, May 22 2012
  
      
  


 
