View Full Version : Mac G4



Keith75
01-04-2005, 06:48 PM
Down here at work my mom was working on an Adobe Illustrator CS file that a customer had given us that was huge! Her dual G4/450 Mac was taking about 90 seconds to open the file, then another 45 seconds just to paint the images on the screen. I told her that was ridiculous and that we should just work on the file on one of the faster AMDs. We put it on an XP 3000+ with 768 Meg of RAM and a Maxtor 250 Gig HD, the same drive the Mac has in it, and proceeded to open the file. To my shock it took almost 4 minutes to open and 2 minutes to paint the screen. It has an nVidia GeForce FX5700 in it with 256 Meg of VRAM.

How could this be? Do you think that Illustrator is just optimized for OS X and not XP, or what?

Keith

Empty_5oul
01-04-2005, 08:06 PM
Just to clarify, how large was that file? 90 secs must mean a massive one.

The only thing I can think of is if it's checking for updates on every open. That would cause a considerable slowdown; it can be changed under the Help menu // Adobe Online. I recommend turning it off completely and doing it manually when needed.

The only other thing it could be, from looking in the help, is if memory is poorly configured, swap files have been edited, etc. If everything is default and handled by XP or equivalent, it should go as fast as it can.

Keith75
01-04-2005, 08:42 PM
This time wasn't how long it took to load the application. It was already loaded on both machines and there weren't any updates going on.

The XP swap file was at its default setting. :D

Empty_5oul
01-04-2005, 09:39 PM
Ah, I see.

Sorry, can't help you then. That was all I could glean from their help. So I guess you'll just have to wait for the files to open in their own time lol.

Beerknurd
01-04-2005, 09:56 PM
Try opening it with an Intel. That should speed it up. :twisted:

rrcrain
01-04-2005, 10:05 PM
By default, Windows (all versions) handles the swap file extremely poorly, and as time progresses, system performance will degrade as a result. On system install, you need to override the default swap file settings by setting the upper and lower limits to the same value: 2.5 times the physical RAM you have installed.
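
If you'd rather script it than click through System Properties, something like this should do it. It's only a rough sketch (not tested here): the PagingFiles registry value is XP's standard location for this setting, but the C: drive letter and the 1 gig of installed RAM are assumptions; you need admin rights, and a reboot for it to take effect.

import winreg  # standard library on Windows

ram_mb = 1024                    # assumption: 1 gig of RAM installed
size_mb = int(ram_mb * 2.5)      # 2560 MB with 1 gig installed

key_path = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path, 0,
                    winreg.KEY_SET_VALUE) as key:
    # Same initial and maximum size, so the file never grows or shrinks.
    winreg.SetValueEx(key, "PagingFiles", 0, winreg.REG_MULTI_SZ,
                      ["C:\\pagefile.sys %d %d" % (size_mb, size_mb)])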

Hardware-wise, Windows XP really needs at least a gig of RAM for high performance when doing graphics manipulation. How is the hard drive formatted? FAT, FAT32 or NTFS?

BigDawg
01-05-2005, 02:33 AM
A Mac will almost always outperform a Windows machine when it comes to graphics and graphics-related software. That is why most graphic designers use Macs.

andrewdodd13
01-05-2005, 10:56 AM
I used to agree with the 2.5 rule, but if you think about it... what's the point in having 4 GB of RAM just to have a 10 GB swap file?
I usually just make a swap file that's 512-1024 MB if below 2 GB of RAM, and turn it off if it's above that, because Windows will probably start using swap before you run out of RAM, which is a bad performance hit.

Empty_5oul
01-05-2005, 11:25 AM
I remember having this conversation on here before, but we ended up none the wiser. What size is it best to have for the swap file?

You say 2.5 x memory?

rrcrain
01-05-2005, 12:52 PM
2.5 X memory.

I'll try to relate what I've read and been told in my MCSE classes, but I'll preface what I'm about to type with the simple disclaimer that I remembered the setting better than the reason for it.

Windows handles virtual memory in what I consider to be an odd and extremely inefficient fashion, in that it writes everything that's in RAM into the page file on the premise that it might need to be paged out. If it does need to page out that section of memory, it's already written, which expedites the process; the OS simply marks that section of the page file as active. As your app demands more and more memory, the page file will grow as designed, but if you left the page file settings at the default values, it starts to become fragmented. Needless to say, as the page file becomes fragmented, performance deteriorates to the point that the machine will even take longer to boot up.

The built-in defragmenter will not address page file fragmentation, nor does it address directory fragmentation or scattering, and to be frank, it does a poor job of defragmenting the drive in general.

Another thing to consider in this mess is what motherboard is being used. Does it have a dual-channel memory controller, or is it an older model with a single-channel controller? The dual-channel board will outperform the single-channel board hands down.

I frequently manipulate JPG files of 500 MB or larger and have acceptable performance. The only issue I have is that Paintshop Pro can't address enough memory in spite of my best efforts at tweaking the system. My genealogy database file is now over 500 MB, and it does take around 10 seconds to open and 45 seconds to save to a RAID drive.

Anonymous
01-05-2005, 01:23 PM
The file system is NTFS.

Here on my gaming machine I have 1 gig of RAM and have my swap file set to a static size of 1.5 gig.

It is just funny to me that the Mac would perform so well doing that, yet it is so darned slow doing DC projects compared to my AMDs. :?

Anonymous
01-05-2005, 01:38 PM
90 seconds to open the file is excessive. A slow drive or a heavily fragmented drive will do this, as will slow memory or a small swap file. If you have 2 physical drives (my machine has 5), split the swap file across the non-boot drives in equal chunks. This gives a small performance boost.
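
Using the same PagingFiles registry value as in the sketch above, the split would look something like this. Again just a sketch: drives D: and E: and the 768 MB chunks are assumptions for illustration.

import winreg

key_path = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path, 0,
                    winreg.KEY_SET_VALUE) as key:
    # One fixed-size chunk per non-boot drive (drive letters assumed).
    winreg.SetValueEx(key, "PagingFiles", 0, winreg.REG_MULTI_SZ,
                      ["D:\\pagefile.sys 768 768",
                       "E:\\pagefile.sys 768 768"])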

Also, take a look at Diskeeper to defragment your hard drive.

andrewdodd13
01-05-2005, 03:53 PM
As someone said before, Macs are optimised for running graphics-intensive applications, which DC projects usually aren't. I remember my old geography teacher used to lecture me about how Macs were far and away the best computers out there... so I ran SETIBOINC on one of the school's Windows 2000 machines against the Mac he insisted on using... I think the PC outperformed it by about 5 times. But then, when we had the graphical interface open, the Windows machine did slow down quite a bit, while the impact on the Mac was much less...

Empty_5oul
01-05-2005, 04:58 PM
rrcrain, seeing as I have 1 gig of memory, it is therefore best for me to have 1024 * 2.5 = 2560?

Do you recommend having the same upper and lower limits as well, as that is also supposed to change performance?

Ototero
01-05-2005, 05:43 PM
I have 1 gig of mem but a swap file of just 200 meg.

There is a registry hack that forces ram use before swap file but I can't remember where I got it.
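
It might be the DisablePagingExecutive tweak; I'm not certain that's the one, but a rough sketch of that value looks like this. Same registry key as the page file settings; it tells XP to keep kernel-mode code in RAM instead of paging it out.

import winreg

key_path = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"
with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, key_path, 0,
                    winreg.KEY_SET_VALUE) as key:
    # 1 = don't page kernel code and drivers out to disk (guessed tweak).
    winreg.SetValueEx(key, "DisablePagingExecutive", 0, winreg.REG_DWORD, 1)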

I'm running CB, WP, TKC, Dnet, SoB, multiple browser windows, DVD burning, really big Excel spreadsheets, and I have no RAM issues.

XP Pro SP2. I suppose it's horses for courses.

Empty_5oul
01-06-2005, 09:43 AM
Well, mine is set to 2560 now and the system is running fine. I'll try some harder stuff later, like opening big files and editing them, etc.
Thx for the advice.

rrcrain
01-06-2005, 10:22 AM
Yes, setting the upper and lower limits to the same value will prevent the page file from fragmenting over time, which keeps your performance from degrading due to a fragmented page file.

rrcrain
01-06-2005, 10:27 AM
Amazingly, my 500 meg genealogy database file can be opened, closed and manipulated with extreme ease on my machine, but open a LARGE graphics file with the latest version of Paintshop Pro and the program runs out of RAM, even though I was generous with my page file, setting it at 3 gig with one gig of RAM. Watching memory usage, the primary problem appears to be that Paintshop Pro can't use all of the memory.

AMDave
01-06-2005, 11:15 AM
I have been advised to note that some defragmentation is "logical" as opposed to "physical", and although displayed on screen as a lovely solid block of colour, your files, and indeed your swap file, may be physically interspersed with blank slots.

I am advised that a couple of defrag progs do NOT remove physical spacings in files and can in fact be responsible for introducing small spacers within them.

NB - I don't give a blind $%#@ hoo-ha who RAZZES me for that comment; it has been demonstrated to me visually at the physical level with a disk sector editor and a handful of defraggers... think how fast the disk is when writing if it is full of blank spaces to write to... that gives you a great impression that the defragger is doing a fantastic job... sneaky, aren't they!

It is very simple to avoid this completely. I like to take a leaf out of the *nix book when installing Windows: I allocate a dedicated partition for swap file use on the fastest disk and allocate the largest "fixed size" swap file that will fill that partition. After that it will not be broken up or have to be defragmented, etc. This is about the fastest and cleanest swap file that I have been able to get under Windows. If you have multiple disks of the same speed, it is even better if the swap partition is not on the disk with the highest file-access rate, so that the heads on the disk with the swap partition are more readily available for the memory paging tasks.

I have heard varying reports on RAID swap on SATA RAID on a desktop platform. I don't personally have the equipment for that yet, but I have read that there are circumstances (on equally fast disks) where this could be slower than a dedicated single partition, though I think it would average out over time.

If you are running OS X or Darwin on a G4 or G5, I expect that you have already got a dedicated swap partition.

As a graphics file is generally manipulated as a single entity in memory, unlike a document or database, which can be manipulated in smaller chunks, I can only suggest more RAM if your swap file is still slowing things down.

I have never visited the Darwin forums for more than a cursory glance, but I could suggest that you start by posting a question hereabouts for further info.
http://www.mac-forums.com/forums/announcement.php?f=20

[EDIT]
I have to agree with Bill... installing Windows XP on a dual G4 seems like a backward step.
[END EDIT]

rrcrain
01-06-2005, 11:48 AM
A dedicated drive is indeed best for the swap file, but few people go quite that far. You are also correct about defrag programs leaving "spaces" between files. This is the default way that NTFS works; it's intended to allow a file to grow a bit and help prevent it from fragmenting. Nice idea, but it completely stops working once the drive is over 70% full. For performance reasons, consider an NTFS-formatted drive at or over 70% capacity to be full. What happens is the file system starts utilizing all of these "spaces" it left behind. In my experience, the operating system just does it as a matter of course, causing your hard drive to fragment all the worse with time.

This is a simplistic explanation of how a page file becomes fragmented and my explanation may be lacking so bear with me.

You start out with a clean build and an unfragmented page file. You add new programs or files, open a few programs, and the system demands more virtual memory, so the page file grows. Those new files are nicely scattered over the drive, preventing the page file from simply extending itself, so it asks the NTFS file system to allocate it X amount of space. NTFS finds 3 open sectors and assigns them to the page file.

This goes on and on over months, and the page file goes from 1 segment to 3 to 5 to 15 and on and on and on... As the file fragments, performance is trashed. OS/2 had the same problem, as did NT and Windows 3 and 3.1. The only solution is, and has been, to choose a generous value and set the upper and lower values the same.
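
If it helps to see the mechanism, here's a toy sketch. It's purely an illustration of first-fit allocation into scattered holes, nothing like a real NTFS model, and all the numbers are made up:

import random

random.seed(1)
disk = [None] * 1000   # None marks a free block
page_blocks = []       # block numbers owned by the page file

def allocate(owner, n):
    # First fit: grab the first n free blocks, wherever they happen to be.
    placed = 0
    for i in range(len(disk)):
        if disk[i] is None:
            disk[i] = owner
            if owner == "page":
                page_blocks.append(i)
            placed += 1
            if placed == n:
                break

def fragments(blocks):
    # Count contiguous runs of block numbers.
    b = sorted(blocks)
    return 1 + sum(1 for x, y in zip(b, b[1:]) if y != x + 1)

for month in range(1, 6):
    for _ in range(10):                        # new programs and files arrive
        allocate("file", random.randint(1, 20))
    for i in range(len(disk)):                 # some old files get deleted,
        if disk[i] == "file" and random.random() < 0.3:
            disk[i] = None                     # leaving scattered holes
    allocate("page", 50)                       # the page file grows into them
    print("month %d: page file in %d pieces" % (month, fragments(page_blocks)))

Run it and you'll watch the fragment count climb month after month. A fixed-size file allocated once never goes through that growth at all.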

Empty_5oul
01-06-2005, 05:13 PM
K, thx for all the info.

I'll make sure to use programs that don't spread across memory but use it in a sensible way :P

Keith75
01-07-2005, 05:05 AM
Can anyone recommend a good defrag program?

Keith

rrcrain
01-07-2005, 09:28 AM
Check this one out

http://www.executive.com/diskeeper/diskeeper.asp?ad=go13

Ototero
01-07-2005, 11:26 AM
Yep, that's a good 'un.

vaughan
01-07-2005, 11:36 AM
I'm using the registered version of Diskeeper 8 and honestly I cannot tell that it makes any difference to the performance of my machines. I guess I was a sucker for their glitzy advertising spiel.
I ran it on my wife's P4 2.53GHz with an 80GB WD 8MB-cache 7200rpm drive yesterday. After about 20 minutes of frantic HDD activity it announced the defrag was complete. Trouble is, the same application said there was 0% fragmentation to begin with. So if the software is so smart, then why did it take so damn long? (IARSN's TaskInfo 2002 showed CPU usage was virtually zero to begin with.)

andrewdodd13
01-07-2005, 12:18 PM
I used Diskeeper on Windows 98 and thought it was great, because it was much faster. When I came onto XP, I looked at the defrag and thought "wtf, did they give me Diskeeper with this PC?"... and OK, it's not exactly the same; Diskeeper has advanced scheduling options, etc. But I found PerfectDisk (www.raxco.com) to be pretty good at defragging.
Firstly, it moves all files related to bootup to the start of the drive, and then arranges your files by date in 3 categories. The other main thing it has over other defraggers is that it can defragment the page file and hibernate file in a special offline defrag mode. This can make quite a big difference.

rrcrain
01-07-2005, 12:26 PM
Diskeeper defrags the page file, the MFT and the directories offline (on reboot) as well. My experience with the product has shown this to be a critical step in recovering system performance. XP and 2000 aren't as bad as NT was on install, but there is still a performance gain.

Beerknurd
01-07-2005, 01:32 PM
What's wrong with the Windows defrag??? That's what I always use..... :?

Sorry for the dumb question...

rrcrain
01-07-2005, 03:03 PM
It's actually an excellent question.

The Windows defrag does not defrag or consolidate the directory structure, the page file or the MFT. With time, both the page file and the Master File Table (MFT) do fragment, and your performance degrades. As you're aware, a directory is simply a special file that acts like an index for the files assigned to it. Let the index file become fragmented, and file access time goes to hell.

The defragger installed in Windows is a stripped-down, crippled version of Diskeeper, nothing more.
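
By the way, before you pay for anything, you can at least get a fragmentation report from XP's own command-line tool. A quick sketch (run as an admin; the Python wrapper is just for consistency with the earlier snippets, the underlying command is plain defrag C: -a -v):

import subprocess

# Analyse drive C: without actually defragmenting anything (-a),
# with a verbose report (-v). Requires Administrator rights.
subprocess.run(["defrag", "C:", "-a", "-v"])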

Beerknurd
01-07-2005, 03:19 PM
Hmmm, interesting.... So Diskeeper is a good investment then??? Is there a performance increase after using it???

rrcrain
01-07-2005, 03:29 PM
Depends on what you're doing. DC projects won't see an improvement, but startup times and file access times should improve. If you install Diskeeper on a newly built computer, it's unlikely you will see much of an improvement, but it will prevent your performance from degrading due to fragmentation.

Also, remember the days of FAT file systems and cross-linked files? Much of that was due to file fragmentation. NTFS is far superior to FAT, but why find out the hard way where disaster will strike?

Beerknurd
01-07-2005, 04:36 PM
true.....

Empty_5oul
01-07-2005, 04:41 PM
I'm just downloading it; I'll give my verdict on Diskeeper soon.

Beerknurd
01-07-2005, 05:26 PM
I'll wait for your opinion before I do.