System Introduction

A 3rd party trust system

Security protocols like SSL have a 3rd party trust system. When you pull up an SSL website, your browser relies on a trusted 3rd party (a certificate authority) to verify the identity claims of the website you initially contacted. The trusted 3rd party's vouching tells you whether the server is actually who it claims to be.
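
As a rough illustration, here is how a client performs that check with Python's standard ssl module, verifying the server against certificate authorities it already trusts (the hostname below is just a placeholder):

import socket
import ssl

# Load the platform's trusted certificate authorities (the "trusted 3rd parties").
context = ssl.create_default_context()

with socket.create_connection(("example.com", 443)) as sock:
    # The TLS handshake fails with ssl.SSLCertVerificationError if no trusted
    # CA vouches for the certificate the server presents.
    with context.wrap_socket(sock, server_hostname="example.com") as tls:
        print("Server identity verified:", tls.getpeercert()["subject"])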

The social analogue of this is you calling the Better Business Bureau after you've made contact with some business. The BBB and that business have only a cursory relationship.

Another way of creating trust relationships is through introduction. If you are introduced to someone through a good friend, you are likely to trust that person more than someone you meet on the street, even if you've read about that person in the newspaper.

So, applying that model to the computer world, you can initiate trust relationships through 'introduction':

You: I need to find a site that offers X service.

Trusted System Z: I know a site that serves X. Let me introduce you to Y.

Y: Hello, I'm Y.

You: Hello, Y. I understand you know a friend of mine.

Y: Yes, I know trusted system Z.

You: You gave me the name of a friend we have in common. Can you ask that friend for a shared password?

Y: Certainly.

Trusted System Z: Here's a password that I just made up and gave to Y: "abracabra".

You: Y, Trusted System Z just gave me a password. Can you tell me what it is?

Y: Certainly. It is "abracabra".

You: It looks like I can trust you, Y. Do you have X service?

Y: Yes, I do.
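
A minimal sketch of how that introduction handshake could look in code. All the names and methods here are illustrative; the only assumption is that You, Y, and Trusted System Z can each already talk to one another over some channel.

import secrets

class TrustedIntroducer:
    """Plays the role of Trusted System Z."""
    def __init__(self):
        self.directory = {}   # service name -> provider offering it
        self.pending = {}     # (seeker, provider name) -> one-time shared password

    def register(self, service, provider):
        self.directory[service] = provider

    def introduce(self, seeker, service):
        provider = self.directory[service]
        password = secrets.token_hex(8)            # a made-up secret, like "abracabra"
        self.pending[(seeker, provider.name)] = password
        provider.receive_introduction(seeker, password)   # Z tells Y the secret...
        return provider, password                         # ...and tells You the same secret.

class Provider:
    """Plays the role of Y."""
    def __init__(self, name):
        self.name = name
        self.shared = {}      # seeker -> password received from the introducer

    def receive_introduction(self, seeker, password):
        self.shared[seeker] = password

    def prove_introduction(self, seeker):
        return self.shared.get(seeker)

# You: "I need to find a site that offers X service."
z = TrustedIntroducer()
z.register("X", Provider("Y"))
candidate, expected = z.introduce("You", "X")

# You: "Can you tell me the password our common friend just made up?"
if candidate.prove_introduction("You") == expected:
    print("It looks like I can trust you,", candidate.name)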

With this system, you can measure how much you trust another system by how close it is to you: how many introductions it took for it to reach you, how many links in the chain separate you, and so on.

Also, if a system makes bad introductions to you, you can degrade their trust rating.
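
As a sketch of how those two rules might be scored (the 1/(1+hops) formula and the halving penalty are arbitrary choices, not part of the idea):

from collections import deque

class TrustGraph:
    def __init__(self):
        self.introduced = {}   # introducer -> set of systems it has introduced
        self.rating = {}       # introducer -> rating, degraded after bad introductions

    def add_introduction(self, introducer, newcomer):
        self.introduced.setdefault(introducer, set()).add(newcomer)
        self.rating.setdefault(introducer, 1.0)

    def distance(self, me, other):
        """How many introductions separate 'me' from 'other' (breadth-first search)."""
        seen, queue = {me}, deque([(me, 0)])
        while queue:
            node, hops = queue.popleft()
            if node == other:
                return hops
            for nxt in self.introduced.get(node, ()):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append((nxt, hops + 1))
        return None   # no chain of introductions exists

    def trust(self, me, other):
        """Closer systems (fewer introductions) score higher."""
        hops = self.distance(me, other)
        return 0.0 if hops is None else 1.0 / (1 + hops)

    def bad_introduction(self, introducer):
        """Degrade an introducer's rating after it vouches for a bad system."""
        self.rating[introducer] = self.rating.get(introducer, 1.0) * 0.5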

lawpoop, Sep 25 2003

PGP "web of trust" explained http://www.heureka..../sunrise/pgpweb.htm
[krelnik, Oct 04 2004]





Annotation:







       isn't this the way some search engines work? not for business, obviously, but to assess the validity of a site which has otherwise looked good in keyword trawls?
badgers, Sep 25 2003
  

       This reminds me of the "web of trust" concept in PGP (Pretty Good Privacy). In fact, I think what you are proposing could be implemented using signed PGP keys instead of simple passwords, so you wouldn't require direct interaction with the third parties.
krelnik, Sep 25 2003
  

       Yeah, I think Google (when it first came out) ranked a site by the number of sites linking to it. They've probably modified their strategy some by now.
lawpoop, Sep 25 2003
  

       Yeah, simple passwords are a bad idea. Keys or something else I don't fully understand would be better. I'll leave it up to the experts.
lawpoop, Sep 25 2003
  

       "I'm sorry, your password of abacabra does not match my password. I'm afraid I'll have to report you to the typo police."
RayfordSteele, Sep 25 2003
  

       I've considered this as a means of personal authentication via the web.   

       It might work if it were done semi-anonymously. That is, both parties enter some uniquely identifying information to begin the process. Then a query engine tries to create a chain of trust between the two.

       I like my way better because there's no need to disclose who trusts whom. With my way, I just submit a list of the people I trust. No one knows whose names I submitted, and I don't know who submitted mine. That improves the quality of the connection, in my opinion, because my anonymity allows me to associate myself only with people I completely trust.

       At the end of a successful transaction (whatever that might be), feedback could be given to the system reinforcing its selection and that particular chain of trust. Negative feedback could do the opposite.
phoenix, Sep 25 2003
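
A rough guess at the mechanics behind that: every person privately submits a trust list, the query engine searches for a chain between the two parties but only answers yes or no, and feedback adjusts the strength of the links that were used. The class and method names are invented for illustration.

from collections import deque

class TrustRegistry:
    """Holds everyone's private trust lists; only yes/no answers leave the system."""
    def __init__(self):
        self.trusts = {}     # person -> set of people they trust (never disclosed)
        self.strength = {}   # (truster, trusted) -> score built up from feedback

    def submit_trust_list(self, person, trusted_people):
        self.trusts[person] = set(trusted_people)

    def _find_chain(self, start, goal):
        seen, queue = {start}, deque([[start]])
        while queue:
            path = queue.popleft()
            if path[-1] == goal:
                return path
            for nxt in self.trusts.get(path[-1], ()):
                if nxt not in seen:
                    seen.add(nxt)
                    queue.append(path + [nxt])
        return None

    def are_connected(self, a, b):
        """Report only whether a chain of trust from a to b exists, not who is on it."""
        return self._find_chain(a, b) is not None

    def record_feedback(self, a, b, good):
        """Reinforce or weaken every link in the chain used for a transaction."""
        chain = self._find_chain(a, b)
        if chain is None:
            return
        for x, y in zip(chain, chain[1:]):
            self.strength[(x, y)] = self.strength.get((x, y), 0) + (1 if good else -1)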
  

       Phoenix, how does your system work? Isn't a unique identifier the *exact opposite* of an anonymous system?
lawpoop, Sep 26 2003
  

       I'm sorry, I'm sure you will explain it better in the next post, but right now it sounds hilarious.   

       Phoenix: I'm looking for service Y.

       System V: I have service Y. What's your unique ID?   

       Phoenix: a7d9a99d92332498d734987cce908098e.

       System V: a7d9a99d92332498d734987cce908098e?   

       Phoenix: That's what I said, dipshit.   

       System V: Aren't you the system that never shipped my jolt cola for that auction I won and paid you for?   

       Phoenix: I don't think so... Someone else.   

       System V: Another a7d9a99d92332498d734987cce908098e?   

       Phoenix: ... right.   

       System V: Another system with a unique ID of a7d9a99d92332498d734987cce908098e?   

       Phoenix: Bye!
lawpoop, Sep 26 2003
  

       I meant the connections were anonymous. The unique identifiers can be public.   

       The concept of transitive trust is well established (cf. Kerberos). The problem, at the personal level, is that you might trust me but I might not trust you. How do you document that relationship? If that relationship is public - and you can see I don't trust you - would you continue to declare your trust in me? I doubt it.
phoenix, Sep 26 2003
  

       Those scenarios can be handled in PGP. You can mutually sign each other's keys (or not), and signatures on keys can be revoked.
krelnik, Sep 26 2003
  

       Phoenix -- can you describe your idea a little more? What are the uids for?   

       OR maybe you want to post a new idea?
lawpoop, Sep 26 2003
  

       I don't think I can explain it any better than I have, at least not without diagrams.   

       Both of our concepts rely on transitive trusts (A trusts B; B trusts C; so A trusts C). My concept just hides the trust relationships in a database so they can't be tampered with.
phoenix, Sep 26 2003
  

       Can you post an example conversation?   

       How does your database hide the relationship? Who holds that database? Is it a central place? Another party?
lawpoop, Sep 26 2003
  

       Well, the way PGP does it is some key trusted parties (one being MIT) run "key servers" where members of the public can store and retrieve their keys, including the signatures that they have gotten from others who "trust" that key. You download the keys you are interested in from these servers, and your local key management software calculates whether you "trust" a given key based on the other keys you have in your keyring. (If A trusts B, and you have A's key and B's key signed by A, you can cryptographically verify the signature locally, without having to talk to A or B).
krelnik, Sep 26 2003
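
A bare-bones sketch of that last, local step, with Ed25519 keys from the Python "cryptography" package standing in for PGP keys; real PGP adds key IDs, trust levels, and revocation on top of this.

from cryptography.hazmat.primitives import serialization
from cryptography.hazmat.primitives.asymmetric import ed25519

# A's and B's keypairs (their public halves would normally come from a key server).
a_private = ed25519.Ed25519PrivateKey.generate()
b_private = ed25519.Ed25519PrivateKey.generate()
a_public, b_public = a_private.public_key(), b_private.public_key()

# "A signs B's key": A signs the raw bytes of B's public key.
b_key_bytes = b_public.public_bytes(
    encoding=serialization.Encoding.Raw,
    format=serialization.PublicFormat.Raw,
)
a_signature_on_b = a_private.sign(b_key_bytes)

# Later, entirely offline: you hold A's public key plus B's key with A's signature.
# verify() raises InvalidSignature if A never actually signed B's key.
a_public.verify(a_signature_on_b, b_key_bytes)
print("B's key carries a valid signature from A -- no need to talk to A or B.")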
  

       What about your system, phoenix?
lawpoop, Sep 27 2003
  

       Okay, sample conversation:
A: Will you take a check?
B: Can I trust you?
A: Let's see!
<A provides his trust code. B does the same. A few seconds pass, then a confirmation is returned.>
B: Okay, I'll take your check.
<Some days later, after the check clears, B acknowledges the transaction, reinforcing the connection between B and A.>
  

       I'm deliberately vague on the process for providing the trust code and receiving the confirmation because I think it doesn't matter much.
phoenix, Sep 28 2003
  

       How do they communicate anonymously? Are they using IP addresses, or what?
lawpoop, Sep 28 2003
  


 
