100.000 +

Topic review

Re: 100.000 +

by Lowlander » Thu Feb 04, 2021 5:09 pm

Try File > Maintain Library with complete optimization checked. It can improve performance.
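Since MM.DB is a SQLite database, you can also sanity-check it yourself if the optimization doesn't seem to help. This is only a sketch, not MediaMonkey's own tooling; the path is an example for a default install, and MediaMonkey should be closed while it runs:

[code]
# Rough health check of MM.DB, assuming it is a standard SQLite 3 file.
# Close MediaMonkey first; adjust DB_PATH to your own install.
import os
import sqlite3

DB_PATH = os.path.expandvars(r"%APPDATA%\MediaMonkey\MM.DB")  # example location

conn = sqlite3.connect(DB_PATH)
status = conn.execute("PRAGMA integrity_check").fetchone()[0]   # 'ok' when healthy
page_size = conn.execute("PRAGMA page_size").fetchone()[0]
free_pages = conn.execute("PRAGMA freelist_count").fetchone()[0]
total_pages = conn.execute("PRAGMA page_count").fetchone()[0]
conn.close()

print(f"integrity: {status}")
print(f"free space inside the file: {free_pages * page_size / 2**20:.1f} MiB "
      f"of {total_pages * page_size / 2**20:.1f} MiB")
[/code]

A lot of free space relative to the file size means a lot of internal dead space, which is exactly what the complete optimization should reclaim.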

Re: 100.000 +

by hh_perth » Thu Feb 04, 2021 4:26 pm

When I select a main node (like "Music") in my 300k+ database, it takes a very long time to display the information. With the cover view selected, the system crashes most of the time.
A workaround is to move the cursor down to a subtree with a smaller amount of information.

Compared with e.g. Winamp, this performance is very poor :(

Besides that, I am a long-time MM fan :)

Re: 100.000 +

by Peke » Thu Dec 29, 2016 11:04 pm

Lowlander wrote: However, there are things you can do to maintain performance. Run File > Maintain Library with complete optimization on a regular basis. If the database file (MM.DB) is on a regular HDD (non-SSD), make sure the file itself is defragmented on a regular basis (Defraggler can do this for a single file).
I would also add that if you have multiple HDDs, you should save MM.DB on a non-system one, as this will also speed up library access.
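As a quick check of where the database currently lives, here is a short sketch; the path is an example for a default install, and actually relocating MM.DB also means pointing MediaMonkey at the new location, which is not covered here:

[code]
# Check whether MM.DB currently sits on the Windows system drive.
import os

DB_PATH = os.path.expandvars(r"%APPDATA%\MediaMonkey\MM.DB")  # example location
db_drive = os.path.splitdrive(os.path.abspath(DB_PATH))[0].upper()
system_drive = os.environ.get("SystemDrive", "C:").upper()

if db_drive == system_drive:
    print(f"MM.DB is on the system drive ({system_drive}); a separate drive may help.")
else:
    print(f"MM.DB is on {db_drive}, separate from the system drive ({system_drive}).")
[/code]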

Re: 100.000 +

by Lowlander » Tue Dec 27, 2016 2:53 pm

The larger the number of files, the slower the performance will be; this is logical, as more data needs to be processed. So with large Libraries, somewhat speedier hardware would be wise.

It also depends on what you're doing and what else you've done in the database. For example, complex AutoPlaylists will slow down any operation that involves AutoPlaylist reading. A larger Library only makes this more pronounced.

However, there are things you can do to maintain performance. Run File > Maintain Library with complete optimization on a regular basis. If the database file (MM.DB) is on a regular HDD (non-SSD), make sure the file itself is defragmented on a regular basis (Defraggler can do this for a single file).
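If you want to do the same compaction outside MediaMonkey, MM.DB is a SQLite database, so the generic VACUUM command rewrites the whole file into one contiguous, compact copy. This is only a rough stand-in for the Maintain Library routine, the path is an example, and you should back up MM.DB and close MediaMonkey first:

[code]
# Generic SQLite compaction of MM.DB - a rough stand-in for "complete
# optimization", not MediaMonkey's own routine. Back up MM.DB and close
# MediaMonkey before running; adjust the path to your install.
import os
import sqlite3

DB_PATH = os.path.expandvars(r"%APPDATA%\MediaMonkey\MM.DB")  # example location

before = os.path.getsize(DB_PATH)
conn = sqlite3.connect(DB_PATH)
conn.isolation_level = None   # autocommit, so VACUUM/ANALYZE run and persist directly
conn.execute("VACUUM")        # rewrite the database into one contiguous, compact file
conn.execute("ANALYZE")       # refresh the query planner's statistics
conn.close()
after = os.path.getsize(DB_PATH)

print(f"MM.DB went from {before / 2**20:.1f} MiB to {after / 2**20:.1f} MiB")
[/code]

VACUUM only removes dead space inside the file; fragmentation of the file on the disk itself still needs a defragmenter such as Defraggler on a spinning HDD.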

Re: 100.000 +

by Peke » Mon Dec 26, 2016 8:08 pm

Hi,
I have 120k, and even though it is slower than when I had 50k, it still works without hiccups.

I suggest that you back up MM.DB and then do a full optimize with text search rebuild. Also, a user with more than 500k told me that to regain speed he needed to join two SSDs into RAID 1, or even try M.2 or U.2 SSDs.
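That backup is easy to script; a minimal sketch (the path is an example for a default install, and MediaMonkey should be closed so the copy is consistent):

[code]
# Timestamped backup of MM.DB before a full optimize.
import os
import shutil
from datetime import datetime

DB_PATH = os.path.expandvars(r"%APPDATA%\MediaMonkey\MM.DB")  # example location
stamp = datetime.now().strftime("%Y%m%d-%H%M%S")
backup_path = f"{DB_PATH}.{stamp}.bak"

shutil.copy2(DB_PATH, backup_path)  # copy2 also preserves file timestamps
print(f"Backed up {os.path.getsize(backup_path) / 2**20:.1f} MiB to {backup_path}")
[/code]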

100.000 +

by Data » Mon Dec 26, 2016 4:04 pm

Hi, my MM is running slower and slower. My database has more than 250.000 items, and MM is designed for 100.000+. Could that be the reason? Is it possible to fix this?
