About dougclayton


Profile Information

  • Location North Carolina
  1. Wet Shaving

    Probably for the same (good) reasons that fast food is a multi-billion dollar industry.
  2. Buying a new computer

    Mentioning SLI support? Now come on....
  3. Buying a new computer

    Burgess, on re-reading your initial post I see you have an excellent grasp of the issues involved in choosing the right computer. Your numbered priorities are excellent and, in my mind, reinforce my suggestion to go with a Mac. It meets so many criteria that it easily overwhelms the one reason you gave for sticking with a PC (less new to learn). True, a Mac will involve learning new things, but by and large it will be "fun" learning, rather than feeling like you have to master some arcane trivia to make your computer work.

    To follow up on my previous post, I can offer some suggestions on the equipment involved. These trends have held for as long as I can remember (about 15 years), and they will remain in effect as long as we are using similar technology. There are three main variable elements in a computer:

    1. The processor, or CPU. Here it's best to choose one on the cheap end of the curve. You pay far more for "high-end" processors than you will ever see in actual performance. For instance, it's not uncommon to pay twice as much for 5% more potential performance, which you will never see in practice. It doesn't matter what the actual performance is--the price/performance ratio has always been best on the low end. (One special case if you go with a Mac: make sure it is one with the new Intel processors, for future compatibility.) Buy on the low end and save the money for the two more noticeable variables below.

    2. The hard drive. Here it's better to be in the mid-to-high end, simply because upgrading later is a pain. Often you pay only 5% more for twice the space, which is a bargain you will definitely notice (unlike theoretical processor performance).

    3. The memory, or RAM. This tends to be more linear: twice the RAM costs about twice as much. Therefore it's best to stick to the midrange unless you know in particular what your needs are. (RAM is also trivial to upgrade later, so you shouldn't pay too much now--it will only get cheaper.)

    To summarize: CPUs, buy slow; hard drives, buy big; RAM, buy "medium-sized". Everything else, with the notable exception of the monitor, is "binary": the machine either has the capability or it doesn't (a CD burner, a wireless card, a modem, etc.). If you stop by the Apple store at an uncrowded time, they should be willing to take all the time you need to help you decide which extras you do and don't need.
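    The buy-slow/buy-big/buy-medium advice above boils down to comparing price-per-unit ratios. A minimal sketch, using made-up illustrative prices (the dollar and capacity figures below are hypothetical, not real market data):

```python
# Illustrative only: hypothetical prices chosen to mirror the ratios
# described above (CPU curve steep, disk curve shallow, RAM linear).

def price_per_unit(price: float, units: float) -> float:
    """Cost per unit of performance or capacity: lower is a better deal."""
    return price / units

# CPUs: twice the price for ~5% more performance.
cpu_low  = price_per_unit(150, 100)    # baseline chip
cpu_high = price_per_unit(300, 105)    # "high-end" chip
assert cpu_high > cpu_low              # the high end is a worse deal

# Hard drives: ~5% more money for twice the space.
hdd_small = price_per_unit(100, 80)    # 80 GB
hdd_big   = price_per_unit(105, 160)   # 160 GB
assert hdd_big < hdd_small             # the big drive is a better deal

# RAM: roughly linear pricing, so the ratio barely moves.
ram_1x = price_per_unit(50, 512)       # 512 MB
ram_2x = price_per_unit(100, 1024)     # 1 GB
assert ram_1x == ram_2x                # no bargain either way
```

    The asymmetry is the whole argument: where the ratio worsens as you spend more (CPUs), buy low; where it improves (disks), buy high; where it is flat (RAM), buy for your actual needs.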
  4. Buying a new computer

    Burgess, I'd like to strongly disagree with those who suggested various components you should buy, with the implication that you should assemble your own computer (with your friend's help). I recently upgraded most of my computer piecewise, so I know just how hard it is to keep up with all the technology and make sure everything will work together. Buying a pre-assembled computer from a company you trust saves you all that headache. (To see why, ask yourself whether you would ever get a new car by assembling it from parts. Sure, gearheads would, because they know and care about all the technology--but I know you don't, based on your original post.)

    Not only would you have to pick everything out and put it together yourself, but you would have no one to turn to when something goes wrong. You complained about your Dell support, with justification, but imagine if there were no single company to call. You're not likely to get warranties on individual parts, and to exercise any warranty you did have, you'd have to troubleshoot the problem to know which part went wrong, take it out, mail it in, and put in the new part when it arrives. With a pre-assembled computer, they do that troubleshooting and replacement for you.

    In that vein, I'd recommend seriously considering an Apple computer. Their machines are the closest you are likely to come to an "appliance" that works out of the box like a toaster. They work with the internet just as well as a PC, and you can get Office for some degree of compatibility with your existing Word files, though you may have to save them to a mutual "interchange" format. They cost more, but which is worth more to you: your time worrying about assembling your own, your time worrying about Windows problems, or your money for a Mac? The cost differential for a Mac is not that great.

    On the other hand, I do agree with the suggestion to move away from floppies. You will have to buy an extra "floppy drive" for most new machines these days, and floppies degrade notoriously. The two other suggestions, writable CDs and USB "keychain" drives, are both excellent choices. They work interchangeably on PCs and Macs (and will for at least a decade into the future), and they hold hundreds of times more than floppies. Also, nearly all new machines come with a CD burner, so you don't have to buy anything extra to use them. (You don't need the more expensive DVD burners if you are coming from floppies, believe me.)

    For longevity with CDs, go with high-quality write-once (CD-R) discs rather than rewritable (CD-RW). This does mean you can't overwrite files, but since the discs are so much bigger than floppies and cost nearly nothing apiece, it doesn't really matter if you build up a history on a CD before it fills up.

    Hope this long post was of some use, Doug
  5. Moral Flaws in Rand's Fictional Heroes

    Do you mean to say that one is evading reality if he does not understand it fully yet? Speaking personally, there have been many ideas that have taken me a while to understand properly (more than a month, at least), but I don't think I was evading reality during the process.
  6. Objectivist view of volition

    After posting this, I realized I could make my point with far fewer words. Determinism and volition are issues of metaphysics, or of "what is": does this entity have free will or is it determined? Predictability and unpredictability are issues of epistemology, or of "what we know": can I know enough about this entity to tell what will happen in the future? While these two are clearly related, they are not the same issue, so showing a high rate of predictability is not the same as showing it is determined.
  7. Objectivist view of volition

    This is an old canard: a rational man will always choose the rational answer, and thus be determined because his choice is predictable. But in fact, "predictable" and "deterministic" are not the same.

    First, predictability does not imply determinism: just because one answer or action is overwhelmingly the correct one does not make the person who answers or acts correctly deterministic. If he could have chosen otherwise, then it was volitional; even if he never does in a million years, that just makes him consistent. And in fact, we see people make the wrong choice all the time in such cut-and-dried situations. I suspect you would say they make the wrong choice because they are irrational, but of course that's part of what volition implies: the choice to be rational or irrational.

    Second, unpredictability does not imply volition, either. For all intents and purposes, the sequence of numbers generated by a pseudo-random number generator on a computer is entirely unpredictable (in fact, the security of online transactions rests on this). Unless you know the computer's starting point and its algorithm, your accuracy at predicting the next bit produced (0 or 1) would be 50%--exactly the same as blind guessing. But of course a computer is not in the least volitional. (The answer "but if we knew everything about the computer" does not counter this point.)
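    The pseudo-random-number point above is easy to demonstrate: the same generator is perfectly predictable to an observer who knows the seed and algorithm, and no better than a coin flip to one who does not. A minimal sketch using Python's standard `random` module (the seed values are arbitrary choices for illustration):

```python
import random

SEED = 42  # the computer's "starting point", hypothetically secret

# The stream of bits the computer produces.
gen = random.Random(SEED)
stream = [gen.getrandbits(1) for _ in range(1000)]

# Observer A knows the seed and the algorithm: 100% accuracy.
predictor = random.Random(SEED)
predictions = [predictor.getrandbits(1) for _ in range(1000)]
assert predictions == stream

# Observer B guesses blindly (modeled as a generator with the wrong seed):
# accuracy hovers around 50%, the same as flipping a coin.
guesser = random.Random(7)
guesses = [guesser.getrandbits(1) for _ in range(1000)]
hits = sum(g == s for g, s in zip(guesses, stream))
print(f"blind guessing: {hits / 1000:.0%} accuracy")
```

    The stream is fully determined yet unpredictable from the outside, which is exactly why unpredictability alone cannot establish volition.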
  8. "What is Consciousness For?"

    I think this is an excellent explanation, and relates to something I've been thinking about while reading this thread. To use your case of our eyes, we have one word for the experience or process of using our eyes ("seeing") and another for the faculty that eyes provide ("vision"). Thus your doctor checks your "vision," and a blind person has lost his "ability to see," but not his "seeing." Furthermore, similarly to consciousness, there is no substitute for actually experiencing seeing firsthand. However, a blind person can come to understand the faculty of seeing, and can identify whether other creatures possess it, even while having no firsthand understanding of what seeing is like. Unfortunately, for consciousness, the same word refers to the experience of being conscious--which cannot be explained or defined, only "pointed out" metaphorically--and to the neurophysiological makeup that gives rise to the experience. This means that the axiomatic concept "consciousness" is a synonym for "awareness" (since consciousness(2) is defined as "the faculty of awareness") but because this dual-usage is an old tradition, we have to rely on context to distinguish the meanings instead. Is this an accurate assessment?
  9. Serenity (2005)

    It did not have them in the TV series, and I thought they looked rather silly. As long as we're on the topic, there's a scene in the movie which shows an external shot of Serenity as Wash is slowly landing it, and it rolls wildly to one side and then back. I highly doubt that a ship like that would be so sensitive to the controls, and furthermore, that anyone would hire a pilot who flew like that. It would be like bottoming out your car's springs because you cornered too hard while parking at the store. But these are minor nitpicks with a show that dares to--gasp!--avoid explosion sounds in outer space.
  10. Serenity (2005)

    There are no movie spoilers in this post, but there is a TV spoiler marked at the end. Mal's performance here gives me chills every time I see it (in fact, just re-reading the words makes the scene real to me again). Now there is a man who knows exactly what he is doing and why.
  11. Sounds right to me. Good luck with your studies!
  12. Without speaking for Burgess, I don't think this is an accurate summary of his position. He said that it is a waste of time to "recreate" a concept, which I take to mean (given the context of your original question) creating it without learning what it is from anyone else first. Thus I understand him as distinguishing between concepts you originate, versus concepts you are taught. On the other hand, your example in math distinguishes between what we prove, versus what we simply take for granted. That is not the same. For instance, I could never have come up with the method of calculus--I am no Newton. If I took your approach, I wouldn't get very far. If I took Burgess' approach, I could learn from those who have gone before me, and come to understand why it is true entirely on my own. However, if I took your new summary's approach, I would just take its validity for granted. Does that make the difference between the three clear?
  13. This is my point exactly. You'll be able to come up with much better stuff after you've really studied Objectivist epistemology.
  14. In honesty, although your ideas are clearly getting more organized, there is still a lot of flailing around. For instance, your description of "characteristic" here is way off the mark. Later, you speak of "logical consequence of the original concept" although we are not talking about concepts, but principles. Similar word mischoices ran throughout your earlier posts. Just as no man can be bigger than his money, no man's thoughts can be more precise than his concepts. I wonder, have you read OPAR (the sections on epistemology) or ITOE? Although they're not light reading, they would clear up a lot of your confusion. I'm glad you are trying to express yourself in your own words rather than blindly quoting Ayn Rand as gospel as so many newcomers do, but a little study at the masters' feet will go a long way. They certainly explain everything far better than I can.
  15. I thought of an example that might make my distinction clearer than the issue of the sun rising. Consider Dr. Peikoff's excellent blood-type example. A medical principle was discovered that, for instance, "All blood with type A is compatible for transfusions." The principle was not "A man with blood type A will live when you give him type-A blood," although at first glance this seems the same. (This compares to "the sun will rise tomorrow.") That would be a lousy principle--there are all kinds of ways a man can die, and having matching blood types is not some magic talisman that wards off death. Thus if you give type-A blood to such a man, and he dies of infection from the operation or catches a disease from the transfused blood, that does not reflect on the blood-type knowledge in the least. No qualifier is needed--the principle was that the man would not die from blood-type incompatibility, and he didn't (he died from some other cause). Now contrast this with the discovery of the Rh factor. Now there were cases in which a man with type-A blood died as a result of being given other type-A blood. This effect was not accounted for in the original knowledge, and so this new cause had to be studied so that the principle could be qualified to include it. Does that illustrate why it is very misleading to expect principles derived by an inductive process to "predict the future"?