
Suggestion about test case for Backup software review


Viewing 4 posts - 1 through 4 (of 4 total)

    I have just read through these two tutorials,

    Filing it away – Evaluating file backup software
    Genie Timeline 3 – New robes, better speed, same great power

    and found that you missed some critical test cases for backup software.

    I am a programmer, which means my working dataset is usually around 1,000,000 files;
    since distributed version control systems took off, there are usually at least that many files on my hard drive.

    Here comes the point:
    Genie Timeline, in my personal experience, is notorious for how badly it handles small files in huge numbers.
    I just pulled Genie Timeline Pro 2012 for testing, and it still sucks at this kind of load.

    Just take a look at the attachment:
    I have about 900 GB and 1,500,000 files to back up,
    and Genie took 1.2 GB of memory for the job and a nearly infinite amount of time to finish it.

    Really, almost infinite, I think:
    it took 5 seconds to copy just one file,
    and with some simple math the full backup would take about 3 months to complete, which no one can tolerate.
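    The simple math, spelled out as a quick sanity check (numbers from the post):

```python
files = 1_500_000        # files in the backup set
seconds_per_file = 5     # observed time to copy a single file
days = files * seconds_per_file / 86_400  # 86,400 seconds per day
print(round(days, 1))    # prints 86.8 -- roughly three months
```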

    By contrast, Oops Backup gets this done in just one night.

    Genie Timeline is definitely doing something wrong somewhere;
    you should really consider this kind of large-quantity, small-file test.


    Hi AndCycle

    Thanks for that very informative post. You're right, I haven't tested a scenario like that, and the result you got is surprising.

    Is there perhaps a file set or utility I could use to generate a large number of small files? Then I could include them in a test and do a simple CRC check (assuming the backup ever manages to complete; I'm not waiting 3 months either 🙂 )
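    The CRC comparison itself would only be a few lines; here's a minimal sketch in Python (the function names are mine, and it assumes the backup reproduces the source's relative paths):

```python
import os
import zlib

def crc32_file(path):
    """Compute CRC-32 of a file, streaming in 1 MB chunks to keep memory flat."""
    crc = 0
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            crc = zlib.crc32(chunk, crc)
    return crc

def verify_backup(source_root, backup_root):
    """Yield relative paths whose backup copy is missing or has a different CRC."""
    for dirpath, _dirnames, filenames in os.walk(source_root):
        for name in filenames:
            src = os.path.join(dirpath, name)
            rel = os.path.relpath(src, source_root)
            dst = os.path.join(backup_root, rel)
            if not os.path.isfile(dst) or crc32_file(src) != crc32_file(dst):
                yield rel
```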

    Thanks again for your post,




    Sorry for the late reply, I forgot to enable notifications.

    I have no experience with generating this kind of specific data;
    I could write a small script to do it, but that would be somewhat unrealistic for testing.
    I think pulling the Mozilla source tree might fit the test.

    mozilla-central has around 170,000 files, totalling 1.7 GB,
    and you can see I have 1,500,000 files on my computer, so you get an idea of how much I am working on.

    It does have lots of small files, and it's real-world usage for a programmer.
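    For what it's worth, the "small script" route really is only a few lines. A sketch in Python (the counts and file size are arbitrary placeholders, and as noted the uniform random data is less realistic than a real source tree):

```python
import os

def generate_small_files(root, n_files=100_000, size=1_024, per_dir=1_000):
    """Create n_files files of `size` random bytes each,
    spread across subdirectories so no single directory gets huge."""
    for i in range(n_files):
        subdir = os.path.join(root, f"dir_{i // per_dir:05d}")
        os.makedirs(subdir, exist_ok=True)
        with open(os.path.join(subdir, f"file_{i:07d}.dat"), "wb") as f:
            f.write(os.urandom(size))
```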

    I just need to show you how. I assume you are a Windows user:

    Grab TortoiseHg and install it.
    Open Windows File Explorer and make a new directory.
    Right-click on it; you should now have a TortoiseHg menu. Click Clone.
    Paste this magic URL into the Source field, then click Clone.
    Wait; it will definitely take some time.
    Then you have a complete copy of the current mozilla-central source tree to get started with.
    You might want to uninstall TortoiseHg afterwards, because you probably won't need it again;
    it does take some resources from File Explorer in order to show the pretty overlay icons.
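    Once the clone finishes, you can sanity-check it against the numbers above (around 170,000 files, 1.7 GB) by walking the tree. A minimal sketch:

```python
import os

def tree_stats(root):
    """Walk a directory tree and return (file_count, total_bytes)."""
    count, total = 0, 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            count += 1
            try:
                total += os.path.getsize(os.path.join(dirpath, name))
            except OSError:
                pass  # skip files that disappear or are unreadable mid-walk
    return count, total
```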

    If you want to set a higher bar for your small-file test,
    just take a look at this page: mozilla-central is just one entry in the repositories list,
    and you can clone all of them, one by one, for a really crazy test.

    If you get into any trouble with these steps,
    I can just pack a copy into a zip for you to download,
    though that will take more time because my upload speed is not that fast :p


    Thanks for that, I’m familiar with using TortoiseSVN so I assume it’s very similar for Mercurial. I’ll certainly look into testing that as I was hoping to put Genie, Oops and the new Windows 8 backup head to head at some point.
