kenstauffer

Members
  • Content count: 90

About kenstauffer

  • Rank: Member
  • Birthday: 01/01/1968

  1. self-service checkout machines

    You probably think that Jim Taggart's idea to remove dining cars was a great example of conservation of resources. Maybe we should require fast food customers to clean the bathrooms. Then this savings could be passed on to the customer, instead of being wasted on hiring a person to clean the toilets. Heck, I bet you can clean toilets faster than most people.
  2. self-service checkout machines

    Here is my take on self-checkout, which I loathe. As long as products have bar codes that must be located before they can be scanned, it doesn't make sense to make the customer scan their own products. The checkout person is much faster because they know where the barcode is located on the eggs, the milk, etc., whereas for me it is a waste of time that I could spend doing something like enjoying life. I hate self-checkout because I find it very degrading to be put into the position of a minimum-wage high schooler just to buy food. Why did I go to college and spend thousands of dollars and hours learning to use my brain, if when I go to spend my money I end up doing it all myself? Sorry to sound like an elitist jerk, but we all deserve a rich life filled with labor-saving technology, not one reduced to carpooling in diamond lanes, communal buses, pumping gas, checking your own oil, and checking out your own groceries. Screw Albertson's and Wal-mart. I deserve better!
  3. Jokes

    That's okay, because the greatest number of people on this forum got it.
  4. Moral Dilemma #4

    The reworded quote is much better. In that case: I wouldn't pry further (unless Brad wants to get more specific). I would continue my plan to call Willis. I wouldn't say anything to him (unless he says something about his former friendship with Brad).
  5. Moral Dilemma #4

    I would get Willis's number and then throw out Brad's (metaphorically speaking). What Brad does and says in this example reveals more about himself than about Willis. I would wonder if Brad is trying to indirectly punish Willis rather than protect me from him. Unless Willis is a suspected serial killer (purse snatcher, crack dealer, etc.), Brad's first response to my request should be "Sure, no problem." This example also establishes that Willis and Brad are friends (or were friends). Brad doesn't hesitate to trash his former friend for a relative stranger. That speaks volumes about Brad's character. Let's consider what Brad is accusing Willis of doing. Brad's accusations are sufficiently vague that Ayn Rand or Leonard Peikoff could be accused of the same. Either Brad has a very rigid/intolerant standard for whom people should associate with (unlikely, considering he is outgoing and social), or Brad is subconsciously trying to punish Willis for something (there may be other possibilities). Both of these speak poorly of Brad, not necessarily of Willis. Finally, Brad doesn't appear to have any problem telling me what to do ("don't call him, if you are smart"). This is a very presumptuous attitude, and I personally don't like to be dealt with this way. What to do? Insist on the number, call Willis, and wait for more information about Brad's character.
  6. Archimedes and the siege of Syracuse

    A productive methodology I sometimes apply to questions like this is to ask myself whether there are examples of this technology in the animal kingdom. Given life's amazing propensity for discovering things like sonar, it isn't unreasonable to conclude that mirrors are probably not an effective weapon, as life has not converged on this strategy. It's not proof, but I think it's a worthwhile strategy when first "peeling the onion" on such questions. Another element of such thinking is to try to imagine a series of small incremental steps that each have survival value, leading up to the "mirror-based weapon". (It's not a requirement that each step be useful as a mirror-based weapon; it just has to be useful for something.)
  7. Honesty and space flight

    On Apollo 13, the networks did not carry the capsule broadcast live. Was it moral for mission control to withhold this from the astronauts? If the astronauts had explicitly asked whether the networks showed their broadcast, what should mission control have said? On Apollo 12, lightning caused problems during the launch. Ground controllers were able to verify the safety of going to the moon. However, they had some concern that the pyrotechnics for the re-entry parachutes might not work. Was it wrong to say nothing of this fact to the crew? In both cases, mission control seems to be limiting the information given to the crew (rather than outright lying) in order to protect the crew from the facts of reality. Are these examples of lying? What principle makes these interactions justified? Are these just real-life examples of lifeboat scenarios?
  8. Narcissism

    Just out of curiosity, why are there so many N's in your life? How frequent is this disorder in society?
  9. Halting Problem and Free Will

    Primes are proven to go on forever.
  10. Narcissism

    This link concerns Michael Moore and how he may have Narcissistic Personality Disorder: http://www.mooreexposed.com/mental.html It made sense to me. When I was reading up on N's, the one thing that struck me as a very useful diagnostic trait is that N's are just as happy with bad attention as with good attention. This differentiates them from most people (who, to some extent, want attention too) because most people avoid the bad kind of attention. An example of bad attention comes from the world of crime. A serial killer, when caught, will tremendously enjoy the court appearances and the angry crowds. This relates to Michael Moore specifically, because he seems not to care about the kind of attention he is getting.
  11. Halting Problem and Free Will

    If that's true then you are a millionaire.
  12. Halting Problem and Free Will

    It is unknown whether the loop terminates, as the Goldbach conjecture is still unsolved. (My comment for GOLDBACH said it always terminates, but that doesn't mean it always returns TRUE. If it is found to return FALSE for any input 'n', then the conjecture is false; so far this function has returned TRUE for values as high as 2 x 10^17.)
  13. Halting Problem and Free Will

    I screwed up... Arghhh. I need to change the final loop to use "NOT GOLDBACH()". Does this loop terminate? (A runnable sketch of this corrected loop appears after this list.)

        halt := False
        n := 2
        repeat
            if NOT GOLDBACH(n) then
                halt := True;
            n := n + 2;
        until halt = True
  14. Halting Problem and Free Will

    This is my implementation for the Goldbach conjecture...

        function PRIME(n)
            -- True if n is prime
            -- (This algorithm terminates for all n)

        function GOLDBACH(n)
            -- True if 'n' is the sum of 2 primes, else False.
            -- (This algorithm terminates for all n)
        begin
            for x = 1 to n
                for y = 1 to n
                    if PRIME(x) and PRIME(y) and (x + y) = n then
                        return TRUE
            return FALSE;
        end

    Does this loop terminate?

        halt := False
        n := 2
        repeat
            if GOLDBACH(n) then
                halt := True;
            n := n + 2;
        until halt = True
  15. Halting Problem and Free Will

    But in this case the human is not doing anything that a machine couldn't be programmed to do. In a context in which the algorithm's parameters are limited (i.e., it has a finite number of states), a machine can solve the halting problem too. (A small sketch of this idea follows below.)
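
A minimal, runnable sketch of the corrected loop from post 13, written in Python rather than the pseudocode above; the names is_prime, goldbach, and search_for_counterexample are my own, not from the original posts. I start the search at n = 4 (rather than n = 2) on the assumption that the conjecture is meant to cover even numbers greater than 2, since 2 itself is not the sum of two primes. The loop halts exactly when a counterexample is found.

    def is_prime(n):
        """True if n is prime; trial division is enough for a sketch."""
        if n < 2:
            return False
        d = 2
        while d * d <= n:
            if n % d == 0:
                return False
            d += 1
        return True

    def goldbach(n):
        """True if n can be written as the sum of two primes."""
        return any(is_prime(x) and is_prime(n - x) for x in range(2, n - 1))

    def search_for_counterexample():
        """Halts (returning n) only if some even n >= 4 is NOT a sum of two primes."""
        n = 4
        while True:
            if not goldbach(n):
                return n      # counterexample found: the conjecture is false
            n += 2            # otherwise keep searching forever

Whether search_for_counterexample() ever returns is exactly the open question, which is the point of the thread: a halting oracle for this one program would decide the Goldbach conjecture.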
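
Post 15 claims that halting is decidable when the program has only finitely many states. Here is a minimal sketch of that idea, again in Python and under my own modeling assumptions (the program is represented as a step function over hashable states, with a separate halting test; none of these names come from the thread): run the program, record every state seen, and if a state ever repeats, the program is looping forever.

    def halts(step, state, is_halted):
        """Decide halting for a finite-state program by detecting a repeated state."""
        seen = set()
        while not is_halted(state):
            if state in seen:
                return False          # state repeated: it will loop forever
            seen.add(state)
            state = step(state)
        return True

    # Example: a counter that adds 3 mod 7 halts when it reaches 0 ...
    print(halts(lambda s: (s + 3) % 7, 1, lambda s: s == 0))   # True  (1 -> 4 -> 0)
    # ... while a counter that adds 2 mod 6, starting at 1, never reaches 0.
    print(halts(lambda s: (s + 2) % 6, 1, lambda s: s == 0))   # False (1 -> 3 -> 5 -> 1 ...)

Because a machine with finitely many states must repeat a state within a bounded number of steps, this check always terminates; the classic undecidability result only applies when the state space is unbounded.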