Hi all,
I use ACDSee to manage my photos as one single collection. However, as my number of photos has grown, things just feel slower and slower. Not terrible, but what took two seconds now takes four, and that starts to get annoying. I've got about 62,000 images cataloged; I just put them in folders by the day they were taken and group those under the year.
To the questions... Do most people just run one DB, or do you segment by some method, e.g. DB1 has years 2002-2014 and DB2 has 2015 on? If you have done this, can you comment on the speed changes and on the pain-in-the-butt factor you found this to be as you work with your files?
Overall, I am surprised by how slow it has become with such a small data set. E.g., a quick-search with an expression like "dave +bennett -kelly" takes 13 seconds. I suggest there is a HUGE opportunity for better speed in this area. I have noticed that while running, ACDSee only uses about 2.5 GB of RAM, while my DB on disk is 4.4 GB. I guess this means it never just loads everything into memory and blasts away. I suspect many people working with larger collections have well-specced machines. My PC has 32 GB, and I'd be quite happy to see ACDSee chew up another 8 GB if it's available. Trading memory for speed is THE classic optimization in software. Have at it, I say. Release the hounds. Burn the bytes. OK, maybe an option to enable "fast search" would be wise, to keep it runnable on typical laptops.
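To make the memory-for-speed idea concrete, here's a rough sketch of the pattern I'm imagining, using Python and SQLite purely as a stand-in. ACDSee's catalog format is proprietary, so the "catalog.db" file, the "photos" table, and its "keywords" column here are all invented for illustration:

```python
import sqlite3
import time

# Hypothetical stand-in for a photo catalog; "catalog.db", the "photos"
# table, and its "keywords" column are invented for this sketch --
# ACDSee's real database format is proprietary.
disk = sqlite3.connect("catalog.db")

# Pay the memory cost once: copy the whole catalog into RAM.
mem = sqlite3.connect(":memory:")
disk.backup(mem)

# Every search after that hits memory instead of disk, in the spirit
# of a "+bennett -kelly" quick-search.
t0 = time.perf_counter()
rows = mem.execute(
    "SELECT path FROM photos "
    "WHERE keywords LIKE '%bennett%' AND keywords NOT LIKE '%kelly%'"
).fetchall()
print(f"{len(rows)} hits in {time.perf_counter() - t0:.3f}s")
```

The up-front copy costs a few GB of RAM once; after that, repeated keyword searches never touch the disk, which is exactly the trade I'd happily make with 32 GB in the box.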
Well... off topic a bit there... back to best practices. If anyone has tips and tricks for how they manage their collections, it would be great to share them. Anyone running a single DB with a really big collection? Like a million-plus images?
DR