edifyingGerbil
Apr 22, 10:33 PM
Would it make a difference if a huge portion of what you've been exposed to, regarding religion/Christianity, was fundamentally incorrect? For example, there's no such place as hellfire; nobody is going to burn forever. Not everybody is going to heaven; people will live right here on the earth. If you learned that a huge portion of those really crazy doctrines were simply wrong, would it cause you to view Christianity/religion differently?
A lot of people need the threat of hell to make them behave or act ethically/morally. What could be worse than eternal damnation?
Certainly nothing physical.
AppliedVisual
Oct 21, 12:42 PM
I'm Speechless. All I can think of is "Wow!"
Makes 20" 1600 x 1200 look puny and the 24" 1920 x 1200 modest.
Yep. Now that I've gone with the 30", I feel so cramped on anything smaller. The dual 30" config is awesome... More than enough space to leave all kinds of stuff accessible - it's insanely wonderfully cool.
...Which brings up my little learning experience over the past couple days. I fired up my 30" as the second display on my G5 quad and all was well. But I was starting to have second thoughts about crowding my desk at home. I packed it back up and took it to the office, plugged it in. Came right up, but I couldn't set the resolution on it to anything higher than 1280x800. Hmmm.... Both had the same video card (or so I thought), both were the same system; the one at the office was manufactured 12/05, the one at home 10/05. So I try some different software re-installs and whatnot and can't figure it out. So I jump online and research until I'm blue... The 7800GT only has a single dual-link DVI port. Weird, I thought it had two? So I packed the monitor back up and took it home to see what was up... Before plugging it into my quad at home, I started to move the system to open it up and noticed the extra fan opening next to the DVI connectors and the round mini-DIN style connector. WTF! So I popped the lid real quick to make sure I wasn't hallucinating. This system has the FX4500 and I never even noticed until now. I guess I never checked. :o I had to dig out my invoice; it was a refurbished system I bought from a local dealer -- the system was a lease return that made it back to them after only 3 months. It supposedly had the 7800GT in it, but nope - FX4500.
Lucky me. :D My resale value on this system just went way up. ;)
How do I look for dead pixels AppliedVisual? Yes I want two. :)
Two kinds of bad pixels usually show on LCD monitors. Dead pixels are pixels that are black and won't do anything; they're somewhat rare, really. Stuck pixels are pixels where one of the R, G or B elements is "stuck" at a certain color value and won't change. Typically stuck pixels are stuck full-on and will stand out against dark backgrounds. The best way to check for them is to run a full-screen game or program that can show a black background; other color backgrounds can be helpful at times too. Stuck pixels will be visible in these situations. Usually you'll see them when they show up, as they do tend to stand out against contrasting backgrounds. Other types of anomalies on these displays are white pixels or sparkles, which can either be static like a dead/stuck pixel or can move or come and go. These are usually caused by a poor video signal or too much power over the video interface. Sometimes it can even be a faulty GPU. Multi-component pixels - where more than one R, G or B component is stuck on at the same pixel location - are often a faulty GPU. But sparkles and multi-component pixels can still be a defective display... I ordered a Dell notebook for an employee a couple years ago and it arrived with hundreds of stuck/multi-component pixels all around the screen edges. Dell swapped it out, but I know it was caused by the system sitting on a loading dock or in a truck overnight when it got to -25F here. The LCD screen literally froze all around the edges, causing irreparable damage!
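If you don't have a full-screen game handy, a tiny script will generate those solid test backgrounds. Here's a rough sketch in Python/Tkinter - the color list and key bindings are just one reasonable way to do it, not the only one:

```python
import tkinter as tk

# Solid test colors: black reveals stuck (lit) sub-pixels, pure red/green/
# blue reveal a dead R, G or B element, and white shows fully dead pixels.
COLORS = ["black", "red", "green", "blue", "white"]

def next_index(i):
    """Next color index in the cycle, or None once every color was shown."""
    return i + 1 if i + 1 < len(COLORS) else None

def run_pixel_check():
    root = tk.Tk()
    root.attributes("-fullscreen", True)
    canvas = tk.Canvas(root, bg=COLORS[0], highlightthickness=0)
    canvas.pack(fill="both", expand=True)
    state = {"i": 0}

    def advance(_event):
        state["i"] = next_index(state["i"])
        if state["i"] is None:
            root.destroy()          # cycle finished
        else:
            canvas.configure(bg=COLORS[state["i"]])

    root.bind("<Key>", advance)     # any key steps to the next color
    root.bind("<Escape>", lambda e: root.destroy())
    root.mainloop()

if __name__ == "__main__":
    try:
        run_pixel_check()
    except tk.TclError:
        pass  # no display available (e.g. running over SSH)
```

Step through each color with any key and scan the screen up close; a stuck sub-pixel jumps out against black, and a dead one against white or its own primary.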
The 30" makes such a huge difference in managing windows of different applications simultaneously. I can see why you wanted 2, AV. Tell me, is there a significant improvement in the design of your 3007 vs the 3005?
AFAIK, there never was a 3005 model, only the 3007. Dell didn't announce their 30" display until last December. I ordered mine on Christmas Eve last year and received it the first week of January. It's a 3007 model as well, Rev.A00. The new one is Rev.A02. Both are identical except I find the old one to have a slight tint to the whites. I had to tweak the color profile for the old one a bit to match the new one, but now it's fine. I don't know if it's a difference in revisions or just normal variation between models or what. The difference is slight, and is only noticeable when the two are side by side, which they are. :D On the bright side, with that Dell forum coupon, my new one was nearly $1K cheaper than the first one.
ekwipt
Apr 12, 11:31 PM
Final Cut X looks amazing
I think everyone's got the one-click colour correcting thing wrong. It will be great for documentaries or reality TV, where you have different format cameras shot by cameramen and producers who may not be exceptional camera operators. You'll be able to cut everything on the timeline and roughly colour correct the shots so they look good for your first edit. It will basically do away with the "Color Corrector 3-way" tool, and I'm guessing you'll be able to fine-adjust the correction with an interface similar to it.
This event was for Final Cut X, I'm hoping they'll be updating the other apps in the suite and release them on the app store.
I'm thinking FCX will turn into another plugin revenue stream for Apple, something they didn't talk about in the conference. So you may log on to an Apple Pro store and download plugins from different manufacturers, and Apple will take a 10% or so cut on everything. So there will be Logic X, and when you download an AU component they'll take a cut; when you download a new FX plugin they'll take a cut.
So maybe Color will be integrated into Final Cut X, but you'll buy it as a plugin of sorts that will be hosted in Final Cut X.
I think Soundtrack Pro will be turfed and Logic X will have its features built into it.
Blackmagic Design and AJA might come up with a new Log and Capture tool for their capture cards, and you won't be sitting waiting for things to capture - you'll be able to edit as they are capturing.
I don't use DVD Studio Pro and believe it will go away; everything is streamed over the net now anyway. Vimeo, YouTube etc.
Have you checked out Sendoid? Studios will be using this more and more to send each other files, no more uploading with YouSendIt or DropBox.
johnnyfiive
Sep 13, 12:53 PM
Add me to the excessive dropped call list; I keep getting them randomly over the past two weeks at my house. I'm going to call AT&T today, hopefully score a MicroCell.
rtdunham
Sep 22, 11:33 PM
I'm not seeing any consensus interpretation that suggests anything of the sort. I can also say with some certainty that the hard drive is "not just for buffering"...It makes no sense for Apple to sell an STB that requires a computer...there's absolutely nothing about the iTV that suggests it's some pricy bolt-on for an existing multimedia computer installation. There'd have been no point in pre-announcing it if it was, and it'd be a complete disaster if it were.
Perhaps we've just been exposed to different sources of info. I viewed the Sept 12 presentation in its entirety, and have read virtually all the reports and comments on MacRumors, AppleInsider, Think Secret, Engadget, the Wall Street Journal, and MacCentral, among others. It was Disney chief Bob Iger who was quoted saying iTV had a hard drive; that was generally interpreted (except by MacCentral, which took the statement literally) to mean it had some sort of storage, be it flash or a small HD, and that it would be for buffering/caching to allow streaming of huge files at relatively slow (for the purpose) wireless speeds.
I'm perfectly willing to be wrong. But I don't think I am. Let's continue reading the reports and revisit this subject here in a day or two.
I can understand Jobs being vague about whether it'll have 802.11g or n. But wouldn't it be nice if, ten days after the product was "revealed", we at least knew WHAT it was (HD or not? etc.) and HOW it will work (still many questions about that). Talk about an RDF!
DeathChill
Apr 20, 10:04 PM
So this site is for fanboys only?
Yes, because someone who doesn't hate a company and its products has to be a fanboy.
Rt&Dzine
Apr 26, 05:16 PM
You gotta do better than youtube videos.
Can you cite anything verified scientifically?
The Nun Bun. Verified to look like Mother Teresa.
The Jesus toast. Verified to look like Jesus or Jeff Daniels.
Xibalba
Oct 7, 04:04 PM
Of course Android might surpass the iPhone. The iPhone is limited to one device, whereas Android is spread over many more devices and will continue to branch out.
ffakr
Oct 6, 12:00 AM
I must love punishment because I scanned this whole thread. We need some sort of system to gather the correct info into one location. :-)
Multimedia, you're so far out of mainstream that your comments make no sense to all but .01 % of computer users.
Seriously.. Most people don't rip 4 videos to h264 while they are creating 4 disk images and browsing the web.
I work at a wealthy research university; I set up a new Mac every week (and too many PCs). A 1st-gen dual 2.0 G5 is plenty fast for nearly all users. I'm still surprised how nicely ours runs considering it's 3 years old. In my experience the dual cores are more responsive (UI latency), but a slightly faster dual proc will run intensive tasks faster.
The reality is, a dual core system.. any current dual core system.. is a fantastic machine for 95% of computer users. The Core2 Duo (Merom) iMacs are extremely fast. The 24" iMac with 2GB RAM runs nearly everything instantaneously.
The dual dual-core systems are ridiculously fast. I've set up several 2.66GHz models and I had to invent tasks to slow the thing down. Ripping DVD to h264 does take some time with Handbrake (about half the playback time - that's ripping 1 hour of DVD in 30 minutes), but the machine is still very responsive while you're doing that, installing software, and having Mathematica calculate Pi to 100,000 places. During normal use (Office, web, mail, chats...) it's unusual to see any of the CPU cores bump up past 20%.
I'm sure Apple will have 4 core cpus eventually but I don't expect it will happen immediately. Maybe they'll have one top end version but it'd certainly be a mistake to move the line to all quad cores.
Here's the reality...
- fewer cores running faster will be much better for most people
- there are relatively few tasks that really lend themselves well to massive parallelization. Video and image editing are obvious because there are a number of ways to slice jobs up (render multiple frames.. break images into sections, modify in parallel, reassemble...).
- though multimedia is an Apple core market.. not everyone runs a full video shop or rendering farm off of one desktop computer. Seriously guys, we don't.
- Games are especially difficult to thread for SMP systems. Even games that do support SMP like Quake and UT do it fairly poorly. UT only splits off audio work onto the 2nd CPU. The real-time nature of games means you can't have 7 or 8 independent threads on an 8-core system without running into issues where the game hangs on a lagging thread. They simply work better in a more serial paradigm.
- The first quad core chips will be much hotter than current Core2 chips. Most people.. even people who want the power of towers.. don't want a desktop machine that actually pulls 600W from the wall because of the two 120-130W CPUs inside. Also, goodbye silent Mac Pros in this config.
- The systems will be far too I/O bound in an 8 core configuration. The memory system does have lots of bandwidth, but the benchmarks indicate it will be bus and memory constrained. It'll certainly be hard to feed data from the SATA drives unless you've got gobs of memory and you're not working on large streams of data (like video).
http://www.tomshardware.com/2006/09/10/four_cores_on_the_rampage/
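The "break images into sections, modify in parallel, reassemble" pattern mentioned above is easy to sketch. Here's a toy Python example - the frame is just a list of pixel rows and the "edit" a simple brighten, purely illustrative and nothing like how a real editor is implemented:

```python
from multiprocessing import Pool

# Per-section "edit": brighten every pixel value, capped at 255.
def brighten(section):
    return [[min(255, px + 40) for px in row] for row in section]

def split(rows, n):
    """Split a frame into contiguous row bands of roughly equal size."""
    k = max(1, len(rows) // n)
    return [rows[i:i + k] for i in range(0, len(rows), k)]

def process_frame(rows, workers=4):
    # Modify the bands in parallel, one per worker process...
    with Pool(workers) as pool:
        bands = pool.map(brighten, split(rows, workers))
    # ...then reassemble them in order.
    return [row for band in bands for row in band]

if __name__ == "__main__":
    frame = [[10 * r + c for c in range(8)] for r in range(8)]
    out = process_frame(frame)
    print(out[0][:4])  # prints [40, 41, 42, 43]
```

The point of the sketch is the shape of the work, not the speedup: because each band is independent, the job scales with core count - which is exactly why image and video tasks parallelize well while game loops don't.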
Finally, Apple's all about the perception. Apple has held back cpu releases because they wouldn't let a lower end cpu clock higher than a higher end chip. They did it with PPC 603&604 and I think they did it with G3 & G4.
It's against everything Apple's ever done to have 3.0 GHz dual dual-core towers in the mid range and 2.33GHz quad-core cpus in the high end.
I see some options here..
Maybe we'll get the dual 2.66 quad cores in one high end system. The price will go up.
Alternately.. this could finally be a rumored Mac Station.. or.. Apple has yet to announce a cluster node version of the intel XServe.
Geez.. almost forgot.
For most people... the Core2 desktop systems bench better than the 4-core systems or even the dual Core2 Xeon systems because the DDR2 is lower latency than the FBDIMMs. To all the gamers.. you don't want slower-clocked quad core chips.. not even on the desktop. You want a speed bump of the Core2 Duo.
Evangelion
Jul 12, 02:22 AM
Oh really.
Ok, tell me what's out there that can substitute on a professional level Photoshop, After Effects and Illustrator.
I am sure you don't work on the business, so you have no clue.
A follow-up question: why the obsession with Photoshop, After Effects and Illustrator? There are other apps out there as well. Why does it seem that about 105% of Mac users are Photoshop users as well (I bet that Photoshop users are in fact in the minority)? Everything related to Apple, OS X and Macs seems to boil down to "but what about Photoshop?". Well, what about it?
You are worried about the fact that Adobe's apps are not yet Universal? Fine, then don't buy a MacIntel. Problem solved.
Hastings101
Apr 5, 08:29 PM
Things I miss from Windows:
Select an item, hold Shift, and select another to select those two items and everything between them.
Start Menu where you can find all of the installed programs easily and a bunch of recent or favorite programs as well (Apple's Menu Bar and the Dock try to accomplish this with recent items and stacks but it's just not as good.)
Being able to easily theme the OS.
Many applications don't quit when you close a window on Mac. On Windows the program quits. It was a lot easier than having to go up to the application's menu and hit Quit.
When you click maximize on Windows the application takes up all of the available screen space (excluding taskbar) instead of just fitting to what the application is displaying. While I do like what OS X does I wish it wasn't the only option available.
The "Add/Remove programs" thing was also really nice. I know that all you have to do is drag and drop to the trash on Mac but sometimes not all of my applications are in my Applications folder and it's a pain to hunt for something.
I could go on and on but I think that's enough lol.
balamw
Apr 11, 10:57 AM
Would it be possible/legal to create a virtual machine on my Mac mini running OS X Lion (when it's released) if I don't want to upgrade from Snow Leopard to Lion on my mini (when I get it / Lion is out)?
Unlikely, but you can install Lion on an external drive and boot from that when you want to.
B
sinsin07
Apr 9, 06:47 AM
I was thinking the same thing. "In my day" a hardcore gamer was someone who custom built a gaming rig consisting of no less than 2 graphics cards (add a third and get SLI + PhysX), each costing at least if not more than a single PS3, the most expensive 'extreme' CPU they could find, and a small nuclear power plant for a PSU, then boasting about their 3DMark scores.
Spanky Deluxe
Mar 18, 01:27 PM
It's only fair. After all, paying twice for our data allowance is completely fair and reasonable......
:rolleyes::rolleyes:
Dr.Gargoyle
Aug 29, 03:00 PM
if anyone was wondering, Stem cells have the remarkable potential to develop into many different cell types in the body. Serving as a sort of repair system for the body, they can theoretically divide without limit to replenish other cells as long as the person or animal is still alive. When a stem cell divides, each new cell has the potential to either remain a stem cell or become another type of cell with a more specialized function, such as a muscle cell, a red blood cell, or a brain cell.
Don't you think people can Google it for themselves if they feel a need to know?
ct2k7
Apr 24, 05:07 PM
don't thank me, thank ct2k7 for saying just why islam is a threat to democracy.
Again, I didn't say that. But I thank you for ignoring my responses to your quotations, made from incomplete sources - it shows your complete unwillingness to participate.
So, follow the local law unless a sane muslim man commits apostasy (then sentence him to death as under sharia law).
Except this doesn't work, since a sane Muslim man would not revolt.
follow local law unless someone insults the name of muhammad or who is critical of islam.
That law applies only to Muslims.
so right there, we've gotten rid of freedom of speech and freedom of conscience.
:rolleyes:
Hodapp
Sep 26, 04:57 PM
And you can swap 'em right in. If Apple doesn't release a Mac Pro upgrade with some other goodies (I'm personally hoping for DDR2, as the 8GB of goofy RAM in my Mac Pro cost me an arm and a leg.) I'm just going to buy a couple quad core chips and toss them in my machine.
Silentwave
Jul 12, 02:55 AM
Costs are all over the place here... on one hand the Core 2 Extreme is more expensive than a Woodcrest, but on the other the Woodcrest setup is more expensive since there's 2 of them and a more specialized logic board. What do I think will happen? I wouldn't be surprised to see a single-Woodcrest system, just to save costs by having one type of LB/RAM, and larger quantities of the same processor to keep costs and logistics manageable.
snebes
Apr 20, 09:09 PM
Windows has an option to hide such files. OS X does not.
Open Terminal, run: ls /
Open the root HD folder in Finder.
See a difference?
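For anyone curious about exactly what the difference is: the shell reports everything at the root, while Finder hides dotfiles, items flagged hidden, and the standard Unix directories. A quick sketch (the `defaults` key shown is the long-standing `AppleShowAllFiles` Finder preference; it's left commented out since relaunching Finder is disruptive, and the key's behavior can vary by OS X version):

```shell
# List everything at the root of the startup disk, dotfiles included.
# Finder, by contrast, hides dotfiles, items carrying the hidden flag,
# and Unix directories such as /bin, /usr, and /etc.
ls -a /

# To make Finder show hidden items as well (assumption: the classic
# Finder preference key; relaunch Finder afterwards to apply it):
#   defaults write com.apple.finder AppleShowAllFiles -bool true
#   killall Finder
```

Setting the key back to `false` and relaunching Finder restores the default behavior.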
alex_ant
Oct 9, 08:12 PM
Originally posted by Abercrombieboy
Alex ant has made some good points on why Macs are a poor buy. They are so much slower and less stable than PCs these days according to everything I read.
Macs aren't a poor buy, though... they're only a poor buy if your primary concern is maximum performance. I doubt they're any less stable than PCs. They are slower, but in my experience they are much more enjoyable computers to use. You will have to weigh your need for performance against this.
Clive At Five
Sep 20, 07:06 PM
That makes no sense at all..
In order to even view and/or listen to any media from another computer, it needs a Front Row interface. That interface must be on the component itself. So in order for Front Row to run, it must have some kind of OS built into it.
That's why I said this:
I find it highly unlikely that there's a physical hard drive in the box that amounts to anything more than the UI and/or cache/buffer.
Read more carefully.
-Clive
SamEdwards
May 13, 05:45 PM
I was looking into the MicroCell because I get tons of dropped calls in my apartment even though I have 4/5 bars. The AT&T employee told me that the EDGE network is much less crowded in our area, Santa Monica, CA, so you get far fewer dropped calls. Go to Settings > General > Network and turn 'Enable 3G' off.
So far so good.
Since the majority of my problems are at home and I have wifi here it's a reasonable short term solution while they build more towers in our area later this year. Of course I can always turn 3G back on when I'm away from home and want to use the internet capabilities of the phone.
Love to know how it works for others.
Cheers,
Sam
citizenzen
Apr 23, 09:35 PM
citizenzen, there are strong elements of faith involved...
Yes, in theistic belief there are.
However, the thread I was responding to specifically tried to logically deduce the existence of God.
Had it been satisfied with basing its belief simply on faith, I'd have very little to say against it.
Honestly, if you really believe in Christianity or any other religion you won't waste your time posting on some internet forum under anonymous names discussing things which ultimately will benefit no one save providing some cheap entertainment.
Google Christian forums (http://www.google.com/search?hl=en&safe=off&qscrl=1&q=christian+forums&aq=0&aqi=g10&aql=&oq=christian+foru).
Then tell them that they're not true believers.
Edge100
Apr 15, 10:53 AM
Oh man. Utterly ridiculous. I'm trivializing the issue? No, I'm putting it in a more accurate and less political context. And you call that hate!
Second, don't drag me into the ridiculous "born gay / chose to be gay" false dichotomy. I swear that gays invented that one just to trick dimwitted social conservatives into parroting it. It's a really poor rendering of Nature vs. Nurture, which is a spectrum and not a binary condition. And it doesn't matter. It's the behavior which is either morally wrong or isn't, so pick your side and argue it. Just don't argue that a behavior is moral because you were "born that way". That opens up a seriously dangerous can of worms.
You also end up implying that because fat people weren't "born that way", it's ok to mistreat them.
And then you finish it off with "I don't even care if you don't like homosexual people"... well that's great. I never said I don't like homosexual people. But I guess you didn't quite accuse me of that with that sentence either. I don't care if you hate your mom and puppies either. You don't hate your mom, do you? And if you do, why? Why don't you love your mom?
Sigh.
Gay is not a "hip counterculture"; that implies it's a choice, pure and simple. It's a state of being. It's like being 6 feet tall, or having blue eyes, or brown hair. It's simply a characteristic of a person.
You know what IS a choice? Religion. And look at the lengths we go to to protect the right of every last believer to say and do the most ridiculous, hateful things.