
Thread: Mac G4

  1. #11
    Join Date
    Jul 2003
    Posts
    390
    The file system is NTFS.

    Here on my gaming machine I have 1 Gig. of RAM and have my swapfile set to a static size of 1.5 Gig.

    It is just funny to me that the Mac would perform so well doing that, yet it is so darned slow doing DC projects compared to my AMDs. :?

  2. #12
    Join Date
    Jul 2003
    Posts
    390
    90 seconds to open the file is excessive. A slow drive or a heavily fragmented drive will do this, as will slow memory or a small swap file. If you have 2 physical drives (my machine has 5), split the swap file across the non-boot drives in equal chunks. This gives a small performance improvement.

    Also, take a look at Diskeeper to defragment your hard drive.
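
    Purely to illustrate the "equal chunks across the non-boot drives" idea, here is a minimal Python sketch; the drive letters and total size are made-up examples, not values read from a real machine.

    # Toy sketch: divide a fixed total page-file size evenly across the
    # non-boot drives, as suggested above. Drive letters and sizes are
    # hypothetical examples.
    TOTAL_PAGEFILE_MB = 1536          # e.g. 1.5 GB total, as mentioned earlier in the thread
    BOOT_DRIVE = "C:"
    ALL_DRIVES = ["C:", "D:", "E:"]   # pretend this box has three physical drives

    non_boot = [d for d in ALL_DRIVES if d != BOOT_DRIVE]
    chunk_mb = TOTAL_PAGEFILE_MB // len(non_boot)

    for drive in non_boot:
        # Each page file is described as "<path> <initial> <maximum>";
        # using the same number twice keeps the chunk at a fixed size.
        print(f"{drive}\\pagefile.sys {chunk_mb} {chunk_mb}")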

  3. #13
    Join Date
    Oct 2004
    Location
    Edinburgh, UK
    Posts
    170
    As someone said before, Macs are optimised for running graphics-intensive applications, which DC projects usually aren't. I remember my old geography teacher used to lecture me about how Macs were by far and away the best computers out there... so I ran SETI/BOINC on one of the school's Windows 2000 machines against the Mac he insisted on using. I think the PC outperformed it by about 5 times, but when we had the graphical interface open, the Windows machine did slow down quite a bit, while the impact on the Mac was much less.

  4. #14
    Join Date
    Jul 2004
    Location
    Sussex, UK
    Posts
    3,734
    rrcrain, seeing as I have 1 GB of memory, it is therefore best for me to have 1024 * 2.5 = 2560 MB.

    Do you recommend having the same upper and lower limits as well, as that is also supposed to change performance?
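
    For what it's worth, the arithmetic above is just RAM in MB times 2.5. A trivial Python sketch of that rule of thumb (the 2.5 multiplier is simply the figure quoted in this thread, not an official recommendation):

    def recommended_pagefile_mb(ram_mb: int, multiplier: float = 2.5) -> int:
        """Rule of thumb from this thread: page file = RAM x 2.5 (in MB),
        used for both the initial and maximum size so the file never resizes."""
        return int(ram_mb * multiplier)

    print(recommended_pagefile_mb(1024))   # 1 GB of RAM -> 2560 MB, matching the figure above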

  5. #15
    Join Date
    May 2004
    Location
    Kent, UK
    Posts
    3,511
    I have 1 gig of mem but a swap file of just 200 meg.

    There is a registry hack that forces RAM use before the swap file, but I can't remember where I got it.

    I'm running CB, WP, TKC, Dnet, SoB, multiple browser windows, DVD burning, really big Excel spreadsheets, and I have no RAM issues.

    XP pro SP2. I suppose it's horses for courses.
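
    The post above doesn't say which registry tweak it means; one value people often cite in this context is DisablePagingExecutive under the Memory Management key, which keeps kernel-mode code in RAM instead of letting it be paged out, so treating that as "the hack" is only a guess. A read-only Python sketch using the standard winreg module (Windows only):

    # Windows-only sketch using the standard-library winreg module.
    # ASSUMPTION: the "registry hack" referred to above is the commonly cited
    # DisablePagingExecutive value; the post itself does not name it.
    import winreg

    KEY_PATH = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
        value, _ = winreg.QueryValueEx(key, "DisablePagingExecutive")
        # 1 = keep kernel and driver code in RAM rather than paging it to disk
        print("DisablePagingExecutive =", value)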

  6. #16
    Join Date
    Jul 2004
    Location
    Sussex, UK
    Posts
    3,734
    Well, mine is set to 2560 now and the system is running fine. I'll try some harder stuff later, like opening big files and editing them.
    Thanks for the advice.

  7. #17
    Join Date
    Aug 2004
    Location
    Edelstein, Illinois
    Posts
    243
    Yes, setting the upper and lower limits to the same value will prevent the page file from fragmenting, so your performance won't degrade over time due to a fragmented page file.
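
    On Windows the configured sizes end up in the PagingFiles value under the Memory Management registry key, with each entry looking something like "C:\pagefile.sys 2560 2560". A small read-only Python check (Windows only) that the two numbers match:

    # Windows-only sketch: read the configured page files and report whether the
    # initial and maximum sizes match, per the advice above.
    import winreg

    KEY_PATH = r"SYSTEM\CurrentControlSet\Control\Session Manager\Memory Management"

    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY_PATH) as key:
        entries, _ = winreg.QueryValueEx(key, "PagingFiles")  # REG_MULTI_SZ list of strings

    for entry in entries:
        parts = entry.split()
        if len(parts) == 3 and parts[1] != "0":
            path, initial_mb, maximum_mb = parts
            status = "fixed size" if initial_mb == maximum_mb else "can grow (may fragment)"
            print(f"{path}: {initial_mb}-{maximum_mb} MB ({status})")
        else:
            # a bare path or "0 0" sizes generally means system-managed sizing
            print(f"{entry}: system-managed")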

  8. #18
    Join Date
    Aug 2004
    Location
    Edelstein, Illinois
    Posts
    243
    Amazingly, my 500 MB genealogy database file can be opened, closed and manipulated with extreme ease on my machine, but open a LARGE graphics file with the latest version of Paint Shop Pro and the program runs out of RAM, even though I was generous with my page file, setting it at 3 GB with one GB of RAM. Watching memory usage, the primary problem appears to be that Paint Shop Pro can't use all of the memory.
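
    If you want numbers behind "watching memory usage", the kernel32 GlobalMemoryStatusEx call reports both physical RAM and page-file (commit) headroom; here is a small ctypes sketch (Windows only). Worth noting: on 32-bit Windows a single process normally gets only 2 GB of virtual address space no matter how large the page file is, which is probably the wall Paint Shop Pro is hitting.

    # Windows-only sketch: query RAM and page-file (commit) headroom via the
    # Win32 GlobalMemoryStatusEx call, using only the standard library.
    import ctypes

    class MEMORYSTATUSEX(ctypes.Structure):
        _fields_ = [
            ("dwLength", ctypes.c_ulong),
            ("dwMemoryLoad", ctypes.c_ulong),          # percent of physical RAM in use
            ("ullTotalPhys", ctypes.c_ulonglong),
            ("ullAvailPhys", ctypes.c_ulonglong),
            ("ullTotalPageFile", ctypes.c_ulonglong),  # commit limit: RAM + page file
            ("ullAvailPageFile", ctypes.c_ulonglong),
            ("ullTotalVirtual", ctypes.c_ulonglong),   # per-process address space (~2 GB on 32-bit)
            ("ullAvailVirtual", ctypes.c_ulonglong),
            ("ullAvailExtendedVirtual", ctypes.c_ulonglong),
        ]

    stat = MEMORYSTATUSEX()
    stat.dwLength = ctypes.sizeof(MEMORYSTATUSEX)
    ctypes.windll.kernel32.GlobalMemoryStatusEx(ctypes.byref(stat))

    mb = 1024 * 1024
    print(f"physical RAM free:  {stat.ullAvailPhys // mb} / {stat.ullTotalPhys // mb} MB")
    print(f"commit charge free: {stat.ullAvailPageFile // mb} / {stat.ullTotalPageFile // mb} MB")
    print(f"memory load:        {stat.dwMemoryLoad}%")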

  9. #19
    AMDave (Moderator, Site Admin)
    Join Date
    Jun 2004
    Location
    Deep in a while loop
    Posts
    9,658
    I have been advised to note that some defragmentation is "logical" as opposed to "physical", and although displayed on screen as a lovely solid block of colour, your files, and indeed your swap file, may be physically interspersed with blank slots.

    I am advised that a couple of defrag programs do NOT remove physical spacings in files and can in fact be responsible for introducing small spacers within them.

    NB - I don't give a blind $%#@ hoo-ha who razzes me for that comment; it has been demonstrated to me visually at the physical level with a disk sector editor and a handful of defraggers... Think how fast the disk is when writing if it is full of blank spaces to write to; that gives you a great impression that the defragger is doing a fantastic job... sneaky, aren't they!

    It is very simple to avoid this completely. I like to take a leaf out of the *nix book when installing Windows: I allocate a dedicated partition for swap file use on the fastest disk and allocate the largest "fixed size" swap file that will fill that partition. After that it will not be broken up or have to be defragmented, etc. This is about the fastest and cleanest swap file that I have been able to get under Windows. If you have multiple disks of the same speed, it is even better if the swap partition is not on the disk with the highest file access rate, so that the heads on the disk holding the swap partition are more readily available for memory paging.
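
    As a rough sketch of the "one fixed-size page file filling a dedicated partition" idea, Python's shutil.disk_usage can tell you how big that fixed size could be; the drive letter below is a hypothetical example of such a partition.

    # Sketch of the dedicated-partition approach: size the page file to (nearly)
    # fill the partition so it is created once and never grows or fragments.
    # The drive letter is a hypothetical example.
    import shutil

    SWAP_PARTITION = "S:\\"   # hypothetical dedicated swap partition
    HEADROOM_MB = 16          # leave a little slack for file-system metadata

    usage = shutil.disk_usage(SWAP_PARTITION)
    pagefile_mb = usage.total // (1024 * 1024) - HEADROOM_MB

    # Same value for initial and maximum, so the file never resizes.
    print(f"{SWAP_PARTITION}pagefile.sys {pagefile_mb} {pagefile_mb}")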

    I have heard varying reports on RAID swap with SATA RAID on a desktop platform. I don't personally have the equipment for that yet, but I have read that there are circumstances (on equally fast disks) where this could be slower than a dedicated single partition, though I think it would average out over time.

    If you are running OS X or Darwin on a G4 or G5, I expect you have already got a dedicated swap partition.

    As a graphics file is generally manipulated as a single entity in memory, unlike a document or database which can be manipulated in smaller chunks, I can only suggest more RAM if your swap file is still slowing things down.

    I have never visited the Darwin forums for more than a cursory glance, but I could suggest that you start by posting a question hereabouts for further info.
    http://www.mac-forums.com/forums/announcement.php?f=20

    [EDIT]
    I have to agree with Bill... installing Windows XP on a dual G4 seems like a backward step.
    [END EDIT]

  10. #20
    Join Date
    Aug 2004
    Location
    Edelstein, Illinois
    Posts
    243
    A dedicated drive is indeed best for the swap file, but few people go quite that far. You are also correct about defrag programs leaving "spaces" between files. This is the default way NTFS works: it is intended to allow a file to grow a bit and help prevent it from fragmenting. Nice idea, but it completely stops working once the drive is over 70% full. For performance reasons, consider a drive at or over 70% capacity to be full when formatted NTFS. What happens is the file system starts utilizing all of those "spaces" it left behind. In my experience, the operating system just does it as a matter of course, causing your hard drive to fragment all the worse over time.

    This is a simplistic explanation of how a page file becomes fragmented, and my explanation may be lacking, so bear with me.

    You start out with a clean build and an unfragmented page file. You add new programs or files, open a few programs, and the system demands more virtual memory, so the page file grows. Those new files are nicely scattered over the drive, preventing the page file from simply extending itself, so it asks the NTFS file system to allocate it X amount of space. NTFS finds 3 open sectors and assigns them to the page file.

    This goes on and on over months, and the page file goes from 1 segment to 3, to 5, to 15, and on and on and on... As the file fragments, performance is trashed. OS/2 had the same problem, as did NT and Windows 3.0/3.1. The only solution is, and has been, to choose a generous value and set the upper and lower values the same.
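
    To make that growth-then-fragmentation story concrete, here is a deliberately tiny toy simulation in Python. It has nothing to do with real NTFS allocation; it just shows why a file that keeps growing on a disk full of scattered files ends up in more and more pieces.

    # Toy model of the fragmentation story above: a disk is a list of slots,
    # other files land in scattered free slots over time, and a page file that
    # grows must take whatever free runs remain. Purely illustrative.
    import random

    random.seed(1)
    DISK_SLOTS = 200
    disk = [None] * DISK_SLOTS      # None = free, "F" = some file, "P" = page file

    def allocate(label, count):
        """Fill 'count' free slots left to right; return how many separate runs were used."""
        placed, runs, i = 0, 0, 0
        while i < DISK_SLOTS and placed < count:
            if disk[i] is None:
                runs += 1
                while i < DISK_SLOTS and placed < count and disk[i] is None:
                    disk[i] = label
                    placed += 1
                    i += 1
            else:
                i += 1
        return runs

    pagefile_fragments = allocate("P", 20)       # initial page file: one clean run
    for month in range(1, 7):
        free = [i for i, slot in enumerate(disk) if slot is None]
        for i in random.sample(free, 15):        # new programs/files land in scattered slots
            disk[i] = "F"
        pagefile_fragments += allocate("P", 10)  # the page file grows again
        print(f"month {month}: page file now in {pagefile_fragments} fragment(s)")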

