Tommyg117
Aug 5, 09:51 PM
Come on, iPod and iPhone! And Mac Pro with Blu-ray!
Dan==
Jul 31, 12:35 PM
I did see your earlier design, actually. I had thought that it was meant to be the same footprint as the Mac Mini. Seeing it again, I can see that I was mistaken. By comparison, my design is 10"W x 11"D x 4"H. I think that to bring it down to the MP's 8.1"W, it would have to be made taller to be reasonable.
Yes, mine's about 5" high, which is tall enough so it would probably need some low hand grips or something. I'm not an engineer for these things, so I'm not even sure it would fit everything, but it looks like it might.
Also, in the vein of quibbling, I think that the perforated look of the MP allows for much better cooling, and therefore hotter components, such as extra boards, faster processors, higher-end GPU, etc. That's the reason I went with it... :)
Perforation alone might only help cooling so much. I've heard that getting cool air onto the parts in question is the most important thing, and internal flow may actually be better served by a mostly (obviously not completely) closed case design. (I'm probably wrong in my recollection, though.)
Maybe now I should draw a scene with the Mac++, a keyboard, a mouse, and an ACD. What do you think?
Sure, I'd love to see some more pretty pictures of what we're dreaming about. It's a little like holding a lottery ticket in your hand, waiting for the numbers to be drawn, visualizing what you're going to buy with the winnings. :-)
-Dan
TheSideshow
Apr 25, 01:35 PM
They can't lose this, surely?
Even Android stores your location in the exact same way iOS does.
Except secured
Willis
Jul 30, 11:09 AM
I think that the bigger issue with Dan=='s design (full credit and kudos for the idea!) is that the Mac Mini is so small that it only uses laptop components. If you want to have a full-size optical drive or a full-size hard drive, you need to use a larger form factor. This is part of the reason for the size of my design.
Here's a comparison in sizes (I've also changed the floor because my wife thought that the reflection was confusing...)
http://www.ghwphoto.com/3MacsFrontSm.png
http://www.ghwphoto.com/3MacsBackSm.png
Cheers!
Actually... that looks really good. If Apple were to incorporate that... man... it'll be a good seller.
amin
Aug 19, 09:42 AM
You make good points. I guess we'll learn more as more information becomes available.
Yes, in some specific results the quad was a bit faster than the dual, though with the combo of Rosetta + Photoshop it's unclear what is causing the difference. However, if you compare the vast majority of the benchmarks, there's a negligible difference.
Concerning Photoshop specifically, as can be experienced on a quad G5, the performance increase is 15-20%. A future jump to 8 cores would theoretically be around an 8% increase. Photoshop (CS2) simply cannot scale adequately beyond 2 cores; maybe that'll change in Spring 2007. Fingers crossed it does.
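As a back-of-envelope check, nothing from Adobe: plug that observed 2-to-4-core gain into Amdahl's law and you get an implied parallel fraction, and from it a predicted 4-to-8-core gain in the same ballpark. A minimal sketch, assuming the ~18% midpoint of the 15-20% figure:

```python
# Sanity-checking the Photoshop scaling numbers with Amdahl's law.
# The parallel fraction p is inferred from the quoted 2 -> 4 core gain,
# not measured; back-of-envelope only.

def speedup(p, n):
    """Amdahl's law: overall speedup on n cores with parallel fraction p."""
    return 1.0 / ((1.0 - p) + p / n)

observed_gain = 1.18  # midpoint of the 15-20% gain quoted above

# Solve speedup(p, 4) / speedup(p, 2) == observed_gain for p (closed form):
# ratio = (1 - p/2) / (1 - 3p/4)  =>  p = (ratio - 1) / (0.75*ratio - 0.5)
p = (observed_gain - 1.0) / (0.75 * observed_gain - 0.5)
print(f"implied parallel fraction: {p:.2f}")  # ~0.47

gain_4_to_8 = speedup(p, 8) / speedup(p, 4) - 1.0
print(f"predicted 4 -> 8 core gain: {gain_4_to_8:.1%}")  # ~10%, near the ~8% above
```

If only about half the work parallelizes, piling on cores past 4 buys almost nothing, which is exactly the 8-core pessimism above.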
I beg to differ. If an app or game is memory intensive, faster memory access does matter. Barefeats (http://barefeats.com/quad09.html) has some benchmarks on dual-channel vs. quad-channel memory on the Mac Pro. I'd personally like to see that benchmark with a Conroe system added. If going from dual to quad channel gave a 16-25% improvement, imagine what a 75% increase in actual bandwidth will do. Besides, I was merely addressing your statement that Woodcrest is faster because of its higher-speed FSB and higher memory bus bandwidth.
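For reference, the theoretical peaks are easy to work out: channels × transfer rate × bus width. A quick sketch, assuming the Mac Pro's DDR2-667 FB-DIMMs and a 64-bit bus per channel (real-world numbers will be lower, especially with FB-DIMM overhead):

```python
# Theoretical peak memory bandwidth: channels * transfer rate * bus width.
# DDR2-667 (667 MT/s) with a 64-bit (8-byte) bus per channel is assumed.

def peak_bandwidth_gb_s(channels, mega_transfers_per_s, bus_bytes=8):
    return channels * mega_transfers_per_s * bus_bytes / 1000.0

print(f"dual channel: {peak_bandwidth_gb_s(2, 667):.1f} GB/s")  # ~10.7 GB/s
print(f"quad channel: {peak_bandwidth_gb_s(4, 667):.1f} GB/s")  # ~21.3 GB/s
```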
AnandTech, at the moment, is the only place with a quad-Xeon vs. dual-Xeon benchmark. And yes, dual Woodcrest is fast enough, but is it cost-effective compared to a single Woodcrest/Conroe? It seems that, for the most part, Mac Pro users are paying for an extra chip but only really utilizing it when running several CPU-intensive apps at the same time.
You're absolutely right about that; it's only measuring the improvement from the increased FSB. If you take into account FB-DIMM's appalling efficiency, there should be no increase at all (if not a decrease) for memory-intensive apps.
One question I'd like to put out there: if Apple has had a quad-core Mac shipping for the past 8 months, why would it wait until Intel quads to optimize the code for FCP? Surely they must have known for some time before that that they would release a quad-core G5, so either optimizing FCP for quads is a real bastard or they've been sitting on it for no reason.
dsnort
Apr 6, 02:33 PM
..I'd rather drive a BMW, I guess you're all happy with the Hondas :)
Your BMW looks a lot like a Yugo to me.
I kid, I kid!
840quadra
Apr 27, 09:49 AM
Incorrect - it's not tracking your direct location as you assert.
For instance, when you're visiting "Harry's Sex Shop and under the counter Heroin sales" it doesn't track that you're actually at that business.
It tracks that your phone contacted "AT&T Cellular Site 601-2L" which might be within line of sight of such a business or it might be in the surrounding neighborhood or somewhat nearby.
My own phone shows that I travel all over the Twin Cities of Minneapolis/St. Paul, since I am an IT staffer who journeys all the time between 25 different offices dispersed all over town - and I think you would be hard-pressed to find out ANYTHING from looking at that picture. It's a giant mess of dots all over town and one satellite facility southeast of town:
<snip>
Anyway. Yes, an enterprising thief with access to your phone could potentially use it. But as it is, collating that data would require some smarts and effort.
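For the curious, those dots come straight out of a SQLite file, so pulling recent entries is a few lines. A minimal sketch, assuming the CellLocation table and column names that the various tracker apps have reported - the schema and file path are assumptions, not a documented API:

```python
# Peek at the much-discussed iOS location cache. The table and column
# names (CellLocation, Timestamp, Latitude, Longitude) are assumed from
# public reports, not an official schema; the path is hypothetical.
import sqlite3

db = sqlite3.connect("consolidated.db")  # a copy pulled from a backup
rows = db.execute(
    "SELECT Timestamp, Latitude, Longitude FROM CellLocation "
    "ORDER BY Timestamp DESC LIMIT 10"
)
for ts, lat, lon in rows:
    # Each row is a cell site the phone talked to, not your exact position.
    print(f"{ts:.0f}: tower near ({lat:.4f}, {lon:.4f})")
db.close()
```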
You stole my map!!!
shamino
Jul 21, 10:07 AM
With all these new technologies with 4, 8, and eventually 24-core capacities (some time in the not-too-distant future), all running at 64-bit, we mustn't forget that software also has to be developed for these machines in order to get the most out of the hardware. At the moment we aren't even maximising Core Duo, let alone a quad core and all the rest!
It really depends on your application.
On the desktop, if you're a typical user that's just interested in web surfing, playing music files, organizing your photo collection, etc., more than two cores will probably not be too useful. For these kinds of users, even two cores may be overkill, but two are useful for keeping a responsive UI when an application starts hogging all the CPU time.
If you start using higher-power applications (like video work - iMovie/iDVD, for instance) then more cores will speed up that kind of work (assuming the app is properly multithreaded, of course.) 4-core systems will definitely benefit this kind of user.
With current applications, however, I don't think more than 4 cores will be useful. The kind of work that will make 8 cores useful is the kind that requires expensive professional software - which most people don't use.
If you get away from the desktop and look to the server market, however, the picture changes. A web server may only be running one copy of Apache, but it may create a thread for every simultaneous connection. If you have 8 cores, then you can handle 8 times as many connections as a 1-core system can (assuming sufficient memory and I/O bandwidth, of course.) Ditto for database, transaction, and all kinds of other servers. More cores means more simultaneous connections without performance degradation.
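To make the thread-per-connection point concrete, here's a toy echo server sketched in Python rather than Apache's C - the structure is the same: one thread per accepted connection, so the OS can spread handlers across cores. (Caveat: CPython's GIL limits CPU parallelism, so the cores-to-connections scaling argument really applies to servers written in lower-level languages; this shows the structure only.)

```python
# Toy thread-per-connection echo server: one thread per accepted client.
import socket
import threading

def handle(conn):
    """Serve one client until it disconnects."""
    with conn:
        while data := conn.recv(4096):  # empty bytes means the peer closed
            conn.sendall(data)

def serve(host="127.0.0.1", port=8080):
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as srv:
        srv.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        srv.bind((host, port))
        srv.listen()
        while True:
            conn, _addr = srv.accept()
            # Each connection gets its own thread; with enough cores, N cores
            # can run N handlers at once (memory and I/O permitting).
            threading.Thread(target=handle, args=(conn,), daemon=True).start()

if __name__ == "__main__":
    serve()
```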
Cluster computing has similar benefits. With 8 cores in each processor, it is almost as good as having 8 times as many computers in the cluster, and a lot less expensive. This concept will scale up as the number of cores increases, assuming motherboards can be designed with enough memory and FSB bandwidth to keep them all busy.
I think we might see a single quad-core chip in consumer systems, like the iMac. I think it is likely that we'll see them in Pro systems, like the Mac Pro (including a high-end model with two quad-core chips.)
I think processors with more than 4 cores will never be seen outside of servers - Xserves and maybe some configurations of Mac Pro. Mostly because that's where there is a need for this kind of power.
netdog
Aug 11, 02:45 PM
I would not consider the entire United States to be just a small pocket on the planet.
In terms of the global mobile market, it is.
The network coverage in America is just awful too. Until I moved to England, I thought that mobile communications were generally problematic. Now I realize that American cellular service just sucks. Even in NYC.
America should have gotten on board with everyone else when spectrum was apportioned, and specified that the infrastructure must be GSM. Instead, though bandwidth is not really an open market but is strictly regulated, they left it up to the providers to implement what they wanted. Now the USA is paying the price, as its GSM network is way behind, and Qualcomm's CDMA has been rendered somewhat obsolete given that the rest of the world (other than Taiwan?) has rejected it.
Soba
Jul 28, 01:02 PM
You can't make a statement like that. That's like saying "I hate General Electric air conditioners." What the heck? All CPUs (and air conditioners) do the same thing.
I'm not sure if this was intended as some kind of throwaway comment or not, but this is not even remotely true.
The original poster said he hated the P4, and honestly, the P4 was a lousy chip design from day 1. The original Pentium 4 chips released about 5 1/2 years ago were outperformed in some instances by an original Pentium chip running at 166MHz. The Pentium 4 was an awful architecture in many respects that simply could not be cleaned up enough to be viable; that would be why Intel abandoned it and based its current designs on the Pentium Pro's core (which was really a very decent server chip in the nineties).
When Apple announced last year they were going with Intel, a lot of people agreed it was a good choice based on the then-current state of the PowerPC architecture and on Intel's planned chip designs. Personally, I was a bit unsure at the time, but was optimistic about the switch and figured we could scarcely do much worse than sticking with the G5, which was languishing. Turning back the clock a bit: if, instead of releasing the G5, Apple had announced a switch to Intel, I would have thought they were crazy. Intel's chips were awful at that time, and there wasn't much of a light at the end of the tunnel, either.
CPUs can be very, very different even if the overall system architecture is similar. And I side with the original poster. The P4 was a dog, and thankfully it is about to be buried forever.
DakotaGuy
Aug 11, 02:05 PM
The only way this iPhone, or whatever it is called, will be successful is if they team up with a carrier or carriers and offer promotions on it like all the other cell phone manufacturers do. I am not sure about Europe or other parts of the world, but people here are used to getting a decent phone for not much money, either with their initial contract or every 2 years when the contract is up. Selling an unlocked phone at some outrageous price ($200-300) is not going to cut it when I can go down and get a decent phone for around $50 with rebates from the cell provider and whoever made the phone.
Now I know there are plenty of people who would buy an Apple phone no matter the price, but if you are going to compete with companies like Motorola, Nokia, Samsung, etc. you have to work with carriers and provide great contract prices.
The whole CDMA vs. GSM debate is kind of like the PowerPC vs. x86 debate, lol. Actually, from everything I have read, CDMA is the newer of the 2 technologies and has a lot of benefits over GSM. In the end, however, both work fine. I think in the US you will find CDMA has a lot better coverage if you look at the coverage maps on the providers' websites. With GSM you hit a lot of dead space, especially in rural areas; CDMA pretty much covers the entire US. Now, in Europe I know it is different and GSM is the standard.
yfile
Apr 6, 11:38 AM
What do you mean, true 3D? Motion 3 integrated 3D reflections, shadows, depth of field, etc. It was around that time I stopped using After Effects. There are still things that AE can do that Motion can't, but that's mostly due to 3rd-party plugins.
I mean 3D objects with materials, textures, shaders, better lighting, better shadows, no crashing several times a day...
3D like ProAnimator FX or Kinemac at least. No plugin required.
DPazdanISU
Aug 7, 03:49 PM
http://events.apple.com.edgesuite.net/aug_2006/event/index.html
padré
Sep 19, 12:39 PM
Thanks for your reply,
I will go for the Mac Pro quad now (I'm updating my home computer, which is a G3, but I'm used to working on a dual G5 for my projects), and yeah, I will always be able to upgrade later. But how about RAM? When DDR3 comes out - I read that it's going to replace FB-DIMMs - will that be upgradable too???
'Cause these FB-DIMMs are so ********** expensive :) thx
technicolor
Sep 19, 08:49 PM
DailyTech (http://www.dailytech.com/article.aspx?newsid=4217) has a mention of the Core 2 Quadro processors.
Pricing mentioned was a little lower than I expected, but these are processors in the Conroe line rather than the Xeon line. Having said that, the 3GHz Xeon is slightly cheaper than the 2.93GHz Conroe.
As expected, the highest rated speed mentioned is 2.67GHz. This Intel crap updates far too frequently... ugh
:mad:
mustangs
Sep 19, 12:00 AM
I purchased my 1.83GHz MacBook with 1GB of RAM on Sep 07, and Apple sent me an email that it was going to be shipped on the 18th. Today I got this email from Apple "
skunk
Mar 22, 07:03 PM
Whether it turns out to be justified depends on subsequent events.
Sticking your neck out there, I see. :)
ctdonath
Mar 22, 02:04 PM
Now it has become a battle of who will get my $500.
A competitor who fails to show up in time forfeits the match.
Not much of a battle now, is it?
janstett
Sep 15, 07:48 AM
The Today show is an embarrassment. The major US TV networks do not have any real morning news programs. How to trim your dog's ears and an inside look at American Idol contestants is NOT NEWS. It is an entertainment talk show.
The network morning "news" shows have always been fluff. What's worse is that the so-called "hard news" shows are just as bad, and not just in the morning -- CNN, MSNBC, and Fox News all run mindless fluff instead of news. And don't get me started with MSNBC airing Eye-Puss in the Morning.
amols
Aug 5, 11:26 PM
No MacBook Pros?? I hope there won't be any. My MBP gets to stay top of the line for a few more weeks ;). Besides, and correct me if I'm wrong, when was the last time that any notebook was merely updated at WWDC?
chasemac
Aug 7, 06:07 PM
I keep reading stuff like this. I don't think Time Machine works with the regular hard drive; you have to use it with an external drive.
Yes, I was wondering the same, because it wouldn't make much sense otherwise, would it? :)
starflyer
Nov 29, 10:40 AM
most of the new stuff out sucks.
I agree. I am SICK AND TIRED of the music industry blaming lack of sales on piracy! Piracy is actually down from what it was a couple of years ago, but they still claim profits are worse now than ever.
Maybe if they didn't put out the same cookie-cutter bands year after year, album after album, and put out albums with 9 good tracks instead of 1 good one with 15 filler pieces of crap, sales might improve!
my $0.02
Actarus
Apr 12, 02:45 PM
Just what do all you whiners NEED in a smartphone that you can't wait for a 3 month "delay" in release of a phone? Cracks me up.
And if any of you actually switch, I'll bet 2 months after the release of the iPhone 5 you'll be so jealous of its superiority over your current smartphone that you'll end up coming back to Apple. Apple knows this, which is why they laugh in your face.
Apple iPhones are everywhere. I think I saw a gal in line at the supermarket on food stamps whip out an iPhone.
And over 95% of iPhone owners are "dumb" users. They don't visit sites like this and if they are on iPhone 3G will probably upgrade to iPhone 4 if that is all that is available in June/July. And they will be happy. They will hear a little about iPhone 5 in Sept. but won't really care. That's the pulse of the American people. Geeks on this forum are in the minority.
Stop saying silly things. Three months? Do you really know that? It could be much more. The lack of info will make me buy an Android with a 4" screen, and that's all. If you don't like my post, don't read it. I'm really fed up with fanboys. The day you realize that Apple will never give you anything in exchange, what will you do?
Lord Blackadder
Mar 23, 12:50 AM
I initially supported the Iraq invasion. I believed the Bush Administration's case for WMDs - in particular, I was swayed by Colin Powell's presentation to the UN. I believed then, as I do now, that Saddam Hussein's government was arbitrary, cruel, and corrupt.
Looking back, it should have been obvious to me that there were a huge number of potential pitfalls - lack of support from Iraqis (and to a lesser extent the international community through the UN) being the most critical. While the initial invasion was predictably successful, the entire issue of post-Saddam Iraq had been poorly thought out - to the extent that it was thought out at all. The result is a tragic disaster of truly epic proportions.
Still, even with this tragedy fresh in our minds (and indeed ongoing, along with the war in Afghanistan), I find it impossible to look at the Libyan situation and say "we should not intervene". There is much I do not like about how my country behaves on the international stage, but in this affair I feel that non-intervention is unconscionable.