  • Database best practices for speed

    Hi all,

    I use ACDSee to manage my photos as one single collection. However, over time, as my number of photos has grown, it feels like things just get slower and slower. Not terrible, but what took two seconds now takes four, and that starts to get annoying. I've got about 62,000 images cataloged; I just put them in folders by day taken and group those under the year.

    To the questions: do most people just run one DB, or do you segment by some method, e.g. DB1 has years 2002-2014 and DB2 has 2015 on? If you have done this, can you comment on the speed changes and the pain-in-the-butt factor you found as you work with your files?

    Overall, I am surprised by how slow it has become with just a small data set, e.g. using quick search with an expression like "dave +bennett -kelly" takes 13 seconds. I suggest that there is a HUGE opportunity for better speed in this area.

    I have noticed that while running ACDSee it only uses about 2.5 GB of RAM. My DB on disk is 4.4 GB. I guess this means that it doesn't ever just load everything into memory and blast away. I suspect that many people editing larger collections have well-spec'd machines. My PC has 32 GB and I'd be quite happy to see ACDSee chew up another 8 GB if it is available. Trading memory for speed is THE classic optimization in computer software. Have at it, I say. Release the hounds. Burn the bytes. OK, maybe an option to enable "fast search" would be wise, so as to keep it able to run on typical laptops.
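
    To make the memory-for-speed idea concrete, here is a minimal sketch of what a "fast search" could look like in principle: an in-memory inverted index that answers keyword queries with set operations instead of scanning every record. This is purely illustrative Python, not ACDSee's actual engine; the tiny `catalog` data and the quick-search semantics are my own assumptions.

    ```python
    from collections import defaultdict

    # Hypothetical catalog: image id -> keywords. Stands in for whatever
    # metadata the real database holds; the data here is invented.
    catalog = {
        1: {"dave", "bennett", "beach"},
        2: {"dave", "kelly"},
        3: {"bennett", "kelly"},
        4: {"dave", "bennett"},
    }

    # Build the index once, trading RAM for speed: keyword -> set of ids.
    index = defaultdict(set)
    for image_id, keywords in catalog.items():
        for kw in keywords:
            index[kw].add(image_id)

    def search(required, excluded=()):
        """Ids that contain every `required` keyword and none of the `excluded` ones."""
        if not required:
            return set()
        result = set.intersection(*(index[kw] for kw in required))
        for kw in excluded:
            result -= index[kw]
        return result

    # Roughly the spirit of "dave +bennett -kelly":
    print(search(["dave", "bennett"], excluded=["kelly"]))  # {1, 4}
    ```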

    Well... off topic a bit there; back to best practices. If anyone has tips and tricks for how they manage their collections, it would be great to share them. Anyone running with a single DB and a really big collection? Like a million plus?

    DR

  • #2
    In my experience AC searches 20,000-25,000 items/second for each search key, so an SSD and database optimization are of utmost importance. Searches with several terms like "foo|bar" or "foo +bar" will double the time. Afaik AC checks every item successively. With 280,000 catalogued items this gives lots of time for postings and coffee.
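
    Just as a back-of-envelope check on what those rates mean in practice (my arithmetic only, assuming one sequential pass per search key as described above):

    ```python
    # Rough estimate only; the rate and the pass-per-key behaviour come
    # from the observation above, not from any ACDSee documentation.
    items = 280_000      # catalogued items
    rate = 20_000        # items/second per search key (low end)
    keys = 2             # e.g. "foo +bar" counts as two keys

    print(f"~{items * keys / rate:.0f} s")   # ~28 s for a two-key search
    ```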

    Originally posted by daver99:
    To the questions: do most people just run one DB, or do you segment by some method, e.g. DB1 has years 2002-2014 and DB2 has 2015 on?
    It doesn't make sense for me; I need access to all items at all times. Start a poll!



    • #3
      +1 Emil
      I am at only 140,000 images and a 69 GB database.
      The only way I can keep it from slowing down is weekly DB optimizations (sometimes daily if I have done a lot of file management).
      The only way I could address speed is by using the fastest available storage (NVMe-based).

      Basically, the same status as Emil.



      • #4
        Originally posted by GusPanella:
        +1 Emil
        I am at only 140,000 images and a 69 GB database.
        The only way I can keep it from slowing down is weekly DB optimizations (sometimes daily if I have done a lot of file management).
        The only way I could address speed is by using the fastest available storage (NVMe-based).

        Basically, the same status as Emil.
        Wow, that seems like a huge DB size for 140,000 images. Are they jam-packed with EXIF, IPTC, and a gazillion categories?

        Is it all stock stuff with many keywords all over the place?



        • #5
          Originally posted by GusPanella:
          I am at only 140,000 images and a 69 GB database.
          This indeed seems to be a huge DB. For our 280,000 catalogued images I have a DB size of 'only' 22 GB, and more than 90% of that is used for thumbnails. I'd guess GusPanella is using thumbnails bigger than just 320 pixels (yes, this is possible!), or the DB optimization fails for some reason.
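
          Putting those figures in per-image terms (my arithmetic only; the thumbnail share for the 69 GB database is an assumption):

          ```python
          # Implied average thumbnail size, assuming ~90% of the DB is thumbnails.
          def per_thumb_kb(db_gb, images, thumb_share=0.90):
              return db_gb * 1024**2 * thumb_share / images

          print(f"{per_thumb_kb(22, 280_000):.0f} KB")   # ~74 KB each (22 GB / 280k images)
          print(f"{per_thumb_kb(69, 140_000):.0f} KB")   # ~465 KB each (69 GB / 140k images)
          ```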



          • #6
            Originally posted by daver99:
            Wow, that seems like a huge DB size for 140,000 images. Are they jam-packed with EXIF, IPTC, and a gazillion categories?

            Is it all stock stuff with many keywords all over the place?
            I do not use any ACDSee-specific keywords or categories, so it is all EXIF, IPTC, and whatever thumbnails ACDSee wants to keep. I am wondering if there is something stuck, as Emil suggested. It certainly seems that 69 GB may be way too large.

            The last time I deleted and recreated the database from scratch was probably two years ago. I remember it taking close to 3 days. Since then, I have added more photos, but I have also increased the speed of the computer, storage, and network.

            I have a short trip planned soon. Maybe I will delete the old database entirely, then let a new one build itself from scratch. I will let you know how it comes out.




            • #7
              The "Save as" dialogue (and others) has an option to set the image compression ratio from 0 to 100 and also offers to save this value as default. Do you have it set to 100%?
              I know that this setting also is used for the proxies located in the [developed] folder. I'm not sure, but may be it also is used for the thumbnails. If time permits I'll check this today and report.

              Beware: creating a new DB from scratch using the embed/catalogue routine isn't an option if you use the face recognition of AC 2019, which doesn't embed face-related data. If you just want AC to replace all thumbnails, there are easier ways.

              Oops, I think I'm hijacking a thread again, sorry. So GusPanella, if you want to drill down into this, please create a new thread.

              [Edit]
              I checked the compression ratio and found that it is used for thumbnails too. 100% gives thumbnails six times larger than at 0%!
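
              If anyone wants to reproduce the general effect outside ACDSee, here is a small sketch with Pillow. It is only an illustration: Pillow's JPEG quality scale does not map exactly onto AC's 0-100 slider, and "sample.jpg" stands in for any test photo you have on hand.

              ```python
              from io import BytesIO
              from PIL import Image

              # Encode the same 320-px thumbnail at several JPEG quality settings
              # and compare the resulting sizes.
              img = Image.open("sample.jpg")
              img.thumbnail((320, 320))

              for quality in (10, 50, 75, 100):
                  buf = BytesIO()
                  img.save(buf, format="JPEG", quality=quality)
                  print(f"quality={quality:3d}: {buf.tell() / 1024:.1f} KB")
              ```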



              • #8
                Wow. I do use 100% as my default value. I would never have guessed my export settings would impact database size. Thanks for the tip. I will change my default to something smaller and set up a special export for high-quality JPGs instead.

                Great tip. Thanks!



                • #9
                  Just to close things out...

                  Original database: Image count: ~140k; Database size: 69 GB

                  Action: Created a new database; re-cataloged the entire storage; changed the default JPG export quality to 25% (was 100%). [Thanks Emil for the tip!]

                  New database: Image count: ~125k; Database size: 26 GB

                  Analysis
                  Yes, the original database was optimized before starting the process.
                  File count, 140k to 125k: the number of directories in the catalog did not change, so I can only attribute this ~10% reduction to something stuck in the database.
                  Database size went from 69 GB to 26 GB; over a 60% reduction!

                  Getting back to the original post
                  In my case:
                  * It takes about 13-18 seconds to return the "first set" of results from this 125k-image database (the first set seems to be around 1,500 images)
                  * It takes over 3 minutes to complete if a search returns over 20,000 results... it will return the first ~1,500 within 13-18 seconds, but the rest take a while
                  * It takes about 20 seconds to search the entire database and return zero results
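
                  As a rough sanity check on those numbers (my arithmetic only, and it assumes the kind of sequential scan Emil described, on this particular machine):

                  ```python
                  # A zero-result search has to examine the whole catalogue before it
                  # can report "no matches", so it gives a crude scan-rate estimate.
                  images = 125_000
                  empty_search_seconds = 20

                  print(f"~{images / empty_search_seconds:,.0f} items/s")   # ~6,250 items/s
                  ```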

                  As far as best practice goes, I have never considered multiple databases and only use a single one. I would think the bookkeeping and coordination of multiple databases would get a bit cumbersome.



                  • #10
                    I'm confused. Does this setting (File | Save As) affect ONLY thumbnail size? I would think it would affect the size of actual image files. I wouldn't want anything other than 100% quality if that is true....



                    • #11
                      I change the JPG quality under Batch > Export > "Default"
                      (not the "Save As" command).



                      • #12
                        Originally posted by ehart:
                        I'm confused. Does this setting (File | Save As) affect ONLY thumbnail size?
                        It's used for thumbnails, proxies in [developed] folders, and "save as" files. Afaik the only function that uses its own settings is the export dialogue.
