Sonos can't process large library, even if it is well below the 65,000 track limit... unless split into separate folders



I have a fairly large music library (~27,000 tracks), which is well below the 65,000 track limit. Recently, after adding some more tracks, I discovered that the Sonos library was only showing about 10% of the tracks… somewhere in the import/indexing process, something had broken. I don’t know for sure, but while my library has about 800 album artists, it has far more actual artists (e.g. on compilations), and the tracks I recently imported were compilations with many artists. So I suspect that the problem is related to the number of artists.

 

I was able to resolve the issue by splitting my library (on my NAS drive) between a compilations folder and “everything else.” When I did, Sonos was able to import/index the whole library. What that tells me is that I haven’t hit some limit for the library in general (i.e. it can handle all 27,000 tracks), but that the import/index process can’t handle all of the tracks in a single pass (possibly because of the number of artists, but that’s just a guess).

 

The problem is resolved, at least for now, but I worry that as I add music I will hit this limit again, and continuing to split my music library isn’t a great solution.

 

Has anyone else experienced this?

 

This seems to be something that Sonos could easily fix, as it clearly isn’t running up against any hard limit (e.g. memory or similar); otherwise it wouldn’t be able to handle my whole library, which it does. In other words, this really looks like a bug in the import/indexing process that could be fixed.



As has been explained in many other threads, the track limit is not a specific number, but a combination of the data in all of the fields that are stored as part of the metadata for each track. And there’s a limit to the amount of memory on each Sonos device that can be dedicated to storing it. Of course, Sonos sizes that limit for the device with the smallest amount of RAM that could be in your system, and doesn’t allow it to be a variable amount of data, since by definition the library needs to be stored on each and every Sonos device in your system.

If you have large amounts of metadata, such as lots of artists, or lyrics, or even long file names (symphonies are a common culprit here), it all impacts the amount of data that can be stored. So you should consider that 65K number to be a rather soft limit, depending on lots of other factors.

I’d think, speaking personally, that if it were an “easy fix”, Sonos would have done it long, long ago, as this has come up time and time again in these forums.
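To get a feel for how the volume of metadata, rather than the raw track count, drives that limit, here is a rough Python sketch (using the third-party mutagen library) that totals up the text stored in the common tag fields of a library. It is purely illustrative: the share path, file extensions and field list are assumptions, and it says nothing about how Sonos actually builds its index.

```python
# Illustrative only (not Sonos's indexer): estimate how much tag metadata a
# library carries, since the practical limit is driven by the combined size of
# the indexed fields rather than the raw track count.
# Requires the third-party "mutagen" library; path and field list are examples.
from pathlib import Path
from mutagen import File, MutagenError

MUSIC_ROOT = Path("/volume1/music")          # hypothetical NAS share
FIELDS = ("artist", "albumartist", "album", "title", "genre")
EXTENSIONS = {".mp3", ".flac", ".m4a", ".ogg"}

total_bytes = 0
tracks = 0
for path in MUSIC_ROOT.rglob("*"):
    if path.suffix.lower() not in EXTENSIONS:
        continue
    try:
        audio = File(path, easy=True)        # easy=True gives uniform tag keys
    except MutagenError:
        continue                             # skip unreadable files here
    if audio is None or audio.tags is None:
        continue
    tracks += 1
    # Count the file name too, since it is also part of what gets indexed.
    total_bytes += len(path.name.encode("utf-8"))
    for field in FIELDS:
        for value in audio.tags.get(field, []):
            total_bytes += len(str(value).encode("utf-8"))

print(f"{tracks} tracks, ~{total_bytes / 1_048_576:.1f} MiB of indexed tag text")
```

Two libraries with the same track count can produce very different totals here, which is one way to see why a fixed “65,000 tracks” figure can only ever be a rule of thumb.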

So you should consider that 65K number to be a rather soft limit, depending on lots of other factors.

Very much so - I’ve hit the limit at about 38k tracks.

I move less used tracks out of the library to make way for new ones. Casting the music to a CCA via a NAS media server allows access to all of the tracks, if I have to use them.

I think that’s one of the reasons that Sonos partnered with Plex, in order to get around that memory limit. You may want to look at that as a potential solution. 

For some of the library functions I think that there is a processing time limit. By splitting the library into smaller shares you may be reducing the running time for each segment.

Examine the length of your file names. Some rippers will attempt to throw the whole first stanza of an opera track into the file name. While the system would be fine with a single folder and file names of 1.flac – 65000.flac, and these would be easy to process, a human would have some trouble with this organization. I try to keep the folder and file names broad and short. I’ll have major folders, such as “Christmas”, “Halloween”, “kids”, that allow including and excluding categories for special occasions, then Artist, CD, and finally “TRK01.flac … TRK99.flac” (“01.flac … 99.flac” would be just as valid). This is a decent compromise that allows me to easily find a file when necessary.

The library can be split into up to 16 shares. A Christmas share can easily be included or not.

Up to some operating system limits, SONOS does not care about file size. Another technique to minimize the number of files would be to merge the tracks of large works, such as an opera, into a single track. Many people would never listen to a partial opera.


As has been explained in many other threads, the track limit is not a specific number, but a combination of the data in all of the fields that are stored as part of the metadata for each track. And there’s a limit to the amount of memory on each Sonos device that can be dedicated to storing it. Of course, Sonos sizes that limit for the device with the smallest amount of RAM that could be in your system, and doesn’t allow it to be a variable amount of data, since by definition the library needs to be stored on each and every Sonos device in your system.

If you have large amounts of metadata, such as lots of artists, or lyrics, or even long file names (symphonies are a common culprit here), it all impacts the amount of data that can be stored. So you should consider that 65K number to be a rather soft limit, depending on lots of other factors.

I’d think, speaking personally, that if it were an “easy fix”, Sonos would have done it long, long ago, as this has come up time and time again in these forums.

I think you are missing the point. Yes, I know the track limit is not a fixed number; I’ve read the many other threads that explain that. But whether my library is in one folder or two, the total data and metadata of all tracks is exactly the same, i.e. I am clearly not hitting a limit on the amount of memory needed to store the metadata or other data, because it is the same either way. Once processed, it is the SAME library, with the same songs, the same metadata, etc. It is just processed in two chunks if it is split into two folders.

 

As for “Sonos would have done so long, long ago...”, I haven’t read any other threads where the problem was resolved by splitting the library into two folders, so I don’t think your point is applicable.

 

Also, FYI, I am a software developer myself and have some insight into these things. If I can resolve the problem by splitting my library into two folders, which results in the Sonos software running the indexing process two times (i.e. once for each folder) but writing the results into a single index (to be stored in memory), then it is almost certainly an easy fix.


I think that’s one of the reasons that Sonos partnered with Plex, in order to get around that memory limit. You may want to look at that as a potential solution. 

Except that I am clearly not hitting a memory limit, as the entire library does fit into memory. The problem is not the size of the library (as evidenced by the fact that the library works fine if I split it into two folders).


For some of the library functions I think that there is a processing time limit. By splitting the library into smaller shares you may be reducing the running time for each segment.

Examine the length of your file names. Some rippers will attempt to throw the whole first stanza of an opera track into the file name. While the system would be fine with a single folder and file names of 1.flac – 65000.flac, and these would be easy to process, a human would have some trouble with this organization. I try to keep the folder and file names broad and short. I’ll have major folders, such as “Christmas”, “Halloween”, “kids”, that allow including and excluding categories for special occasions, then Artist, CD, and finally “TRK01.flac … TRK99.flac” (“01.flac … 99.flac” would be just as valid). This is a decent compromise that allows me to easily find a file when necessary.

The library can be split into up to 16 shares. A Christmas share can easily be included or not.

Up to some operating system limits, SONOS does not care about file size. Another technique to minimize the number of files would be to merge the tracks of large works, such as an opera, into a single track. Many people would never listen to a partial opera.

 

File names are not unreasonably long, and the number of tracks is clearly not the problem, as Sonos is able to handle the whole library if I split it into two folders.

The running time theory might be it though, although I can’t imagine why Sonos would cap the running time and simply give up after some period of time.

If I can resolve the problem by splitting my library into two folders, which results in the Sonos software running the indexing process two times (i.e. once for each folder) but writing the results into a single index (to be stored in memory), then it is almost certainly an easy fix.

I have not read the whole thread, so I may have missed something - but per the quoted text, are you saying that with this, your use case is fully addressed?

 

The running time theory might be it though, although I can’t imagine why Sonos would cap the running time and simply give up after some period of time.

Implied as part of the indexing process is a hash or sorting scheme. These are surprisingly complex processes that have severe, non-linear time penalties if the input data happens to be in the wrong order. One of the simplest sort schemes is blazingly fast if the data happens to be in the correct order, but is one of the worst possible choices if the data happens to be in reverse order.

I think that a fundamental SONOS design philosophy is not to allow a system to be trapped in an endless loop of some sort. The most graceful bailout from an apparent endless loop is a time limit. At least there is some sort of error message indicating that the situation is somewhat under control. In the case of our library index, it is possible that, if the process were allowed to run for a few more hours, it could complete normally. I think that each share is treated as an independent block of files, resulting in a much shorter running time for each of the smaller, more or less independent blocks.
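To illustrate the general point about input order (this has nothing to do with Sonos’s actual code), here is a minimal Python timing sketch of one of those simple sorts, insertion sort, which runs in near-linear time on already-sorted input but quadratic time on reverse-sorted input:

```python
# Toy demonstration: the same naive sort is fast on sorted input and very slow
# on reverse-sorted input of the same size.
import time

def insertion_sort(items):
    items = list(items)
    for i in range(1, len(items)):
        key = items[i]
        j = i - 1
        while j >= 0 and items[j] > key:     # shifts every element on reversed input
            items[j + 1] = items[j]
            j -= 1
        items[j + 1] = key
    return items

n = 10_000
for label, data in (("already sorted", list(range(n))),
                    ("reverse order", list(range(n, 0, -1)))):
    start = time.perf_counter()
    insertion_sort(data)
    print(f"{label}: {time.perf_counter() - start:.2f}s")
```

On typical hardware the second case takes orders of magnitude longer than the first, which is the kind of behaviour a fixed time limit would turn into an apparent failure.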


Sonos staff can see data you can’t on your Sonos; it might be worth calling and starting a series of tests with them to see whether the diagnostics you can submit, or their other tools, show them anything they can share with you.


Certainly not a general issue; I have no issues with ca. 45k files in my music library.


If I can resolve the problem by splitting my library into two folders, which results in the Sonos software running the indexing process two times (i.e. once for each folder) but writing the results into a single index (to be stored in memory), then it is almost certainly an easy fix.

I have not read the whole thread, so I may have missed something - but per the quoted text, are you saying that with this, your use case is fully addressed?

Not fully addressed. I have found a workaround, at least for now, but somewhere there is either a bug (e.g. a memory leak) that can be fixed, or a design limit that is unrelated to the physical memory limits of the devices, and which therefore can also be fixed.


 

The running time theory might be it though, although I can’t imagine why Sonos would cap the running time and simply give up after some period of time.

Implied as part of the indexing process is a hash or sorting scheme. These are surprisingly complex processes that have severe, non-linear time penalties if the input data happens to be in the wrong order. One of the simplest sort schemes is blazingly fast if the data happens to be in the correct order, but is one of the worst possible choices if the data happens to be in reverse order.

I think that a fundamental SONOS design philosophy is not to allow a system to be trapped in an endless loop of some sort. The most graceful bailout from an apparent endless loop is a time limit. At least there is some sort of error message indicating that the situation is somewhat under control. In the case of our library index, it is possible that, if the process were allowed to run for a few more hours, it could complete normally. I think that each share is treated as an independent block of files, resulting in a much shorter running time for each of the smaller, more or less independent blocks.

Possibly the case, and as a software developer with a Comp Sci degree, I am familiar with how badly some algorithms can scale. Your comment that “each share is treated as an independent block of files” is undoubtedly correct, which is why splitting my library into two shares resolves the problem. And that also points to the correct fix: it should be very straightforward to modify the code so it treats each folder or subfolder “as an independent block of files” and thereby bounds the work done by the hashing/sorting scheme.
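Purely as an illustration of the per-folder approach being described (and not a claim about how Sonos’s indexer is actually written), a chunked index build might look something like the following Python sketch; the folder layout, the scan_folder helper and the artist-from-path shortcut are all invented for the example:

```python
# Hypothetical sketch: index each top-level folder as an independent block,
# then merge the per-folder results into one combined index.
from pathlib import Path
from collections import defaultdict

def scan_folder(folder: Path) -> dict[str, list[Path]]:
    """Build a small index for one folder, independently of the others."""
    block: dict[str, list[Path]] = defaultdict(list)
    for path in folder.rglob("*.flac"):
        artist = path.parent.parent.name     # assumes Artist/Album/Track layout
        block[artist].append(path)
    return block

def build_index(root: Path) -> dict[str, list[Path]]:
    combined: dict[str, list[Path]] = defaultdict(list)
    for folder in sorted(p for p in root.iterdir() if p.is_dir()):
        # Each folder is a bounded chunk of work, regardless of library size.
        for artist, tracks in scan_folder(folder).items():
            combined[artist].extend(tracks)
    return combined

index = build_index(Path("/volume1/music"))  # hypothetical NAS share
print(f"{len(index)} artists indexed")
```

The merge step is cheap; the point is only that no single scan ever has to process the entire library at once, which is effectively what splitting the share into multiple folders forces today.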


Sonos staff can see data you can’t on your Sonos; it might be worth calling and starting a series of tests with them to see whether the diagnostics you can submit, or their other tools, show them anything they can share with you.

Did that, and spent probably 8 hours or so (across multiple calls). A lot of that was fruitless (“are your files in MP3 format or some protected format?”), and to be honest I got to a resolution more by thinking about where the problem might be and through trial and error on possible workarounds (e.g. trying to import JUST the newly added tracks to confirm there wasn’t something in those tracks causing it to blow up… which confirmed that the problem was not in the tracks themselves, but in the aggregate of all the tracks, e.g. the # of tracks, or the # of artists, or some other variable that exceeded a limit in the indexing software).


Certainly not a general issue; I have no issues with ca. 45k files in my music library.

Which is why I don’t think the problem is the number of tracks (I have fewer tracks than you do, and less than half the guideline limit), and why I suspect the issue is something like the number of artists, which in my case is large: I have ~200 albums that are compilations with about 2,000 artists on them, plus another 800 artists on non-compilations. My guess, and it is just a guess, is that there is a table used somewhere in the indexing process that keeps track of artists, and it is hitting a limit that the software doesn’t check for and generate an error message about.
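As an illustration only of the kind of unchecked limit being guessed at here (the class and the capacity figure are invented and have nothing to do with Sonos’s real code), a fixed-size artist table that fails silently instead of reporting an error would behave like this Python sketch:

```python
# Hypothetical fixed-capacity table: entries past the limit are silently
# dropped unless the caller asks for a loud failure.
class ArtistTable:
    def __init__(self, capacity: int = 2048, fail_loudly: bool = False):
        self.capacity = capacity
        self.fail_loudly = fail_loudly
        self.artists: set[str] = set()

    def add(self, artist: str) -> bool:
        if artist in self.artists:
            return True
        if len(self.artists) >= self.capacity:
            if self.fail_loudly:
                raise OverflowError(f"artist table full at {self.capacity}")
            return False                     # silent drop: tracks just vanish from the index
        self.artists.add(artist)
        return True

table = ArtistTable(capacity=2048)           # silent-failure mode
for n in range(3000):
    table.add(f"Artist {n}")
print(len(table.artists))                    # 2048 - the rest were dropped with no error
```

Whether or not this is what actually happens inside the indexer, it matches the observed symptom: part of the library simply goes missing with no error reported.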


Hi,

just wanted to add that I have a similar issue and am still working on resolving it. I might try your split into multiple folders next.

No selections are available - again and again - no errors reported

To me (also a developer) it is hard to accept that even the Sonos support team doesn’t have enough information to find out why the index is suddenly gone. Also, some visibility into the indexing process is needed, such as progress reporting.

Very expensive hardware and appalling software is my verdict - I cannot recommend it.

 

On the Windows Desktop Controller sometimes there is an error message in Help → Error Log…  if you’ve initiated the Library Index from that controller.

 

On the Windows Desktop Controller sometimes there is an error message in Help → Error Log…  if you’ve initiated the Library Index from that controller.

Right, but that log is pretty much useless, as it just reflects the errors that are displayed on the screen. No further details or reasons for the errors are given. In fact, when I built up my library in 3 folders, I kept getting this error over and over, and just ignored it and continued until eventually it all worked fine. So the error stating that the drive is unavailable is flat-out wrong, as the mapped NAS drive was there all the time.

 
 

On the Windows Desktop Controller sometimes there is an error message in Help → Error Log…  if you’ve initiated the Library Index from that controller.

Right, but that log is pretty much useless, as it just reflects the errors that are displayed on the screen. No further details or reasons for the errors are given. In fact, when I built up my library in 3 folders, I kept getting this error over and over, and just ignored it and continued until eventually it all worked fine. So the error stating that the drive is unavailable is flat-out wrong, as the mapped NAS drive was there all the time.

Maybe load the entire local library into a tag editor, like MP3Tag for example, and see if there are any errors or special characters in the tags, and correct them accordingly… also check the cover art too.

Here are some of the Sonos library limitations that I have gathered from other posts here in the community:

Field Name Character Limits

  • Artists - 76
  • Album - 92
  • Track - 100
  • Genre - 22
  • File name - 100
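For anyone who wants to screen a library against limits like these before indexing, here is a rough Python sketch using the third-party mutagen library. The field-to-limit mapping mirrors the list above (“Track” is taken to mean the track title), and the share path and file extensions are just examples:

```python
# Rough check of tag field lengths and file name lengths against the limits
# listed above. Requires the third-party "mutagen" library.
from pathlib import Path
from mutagen import File, MutagenError

LIMITS = {"artist": 76, "album": 92, "title": 100, "genre": 22}
FILENAME_LIMIT = 100
MUSIC_ROOT = Path("/volume1/music")          # hypothetical NAS share

for path in MUSIC_ROOT.rglob("*"):
    if path.suffix.lower() not in {".mp3", ".flac", ".m4a", ".ogg"}:
        continue
    if len(path.name) > FILENAME_LIMIT:
        print(f"{path}: file name is {len(path.name)} characters")
    try:
        audio = File(path, easy=True)        # easy=True gives uniform tag keys
    except MutagenError:
        print(f"{path}: unreadable tags")
        continue
    if audio is None or audio.tags is None:
        continue
    for field, limit in LIMITS.items():
        for value in audio.tags.get(field, []):
            if len(str(value)) > limit:
                print(f"{path}: {field} is {len(str(value))} characters (limit {limit})")
```

This only flags candidates for editing in a tag editor such as MP3Tag; it does not prove that any particular file is what breaks the Sonos index.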
 

On the Windows Desktop Controller sometimes there is an error message in Help → Error Log…  if you’ve initiated the Library Index from that controller.

Right, but that log is pretty much useless, as it just reflects the errors that are displayed on the screen. No further details or reasons for the errors are given. In fact, when I built up my library in 3 folders, I kept getting this error over and over, and just ignored it and continued until eventually it all worked fine. So the error stating that the drive is unavailable is flat-out wrong, as the mapped NAS drive was there all the time.

Maybe load the entire local library into a tag editor, like MP3Tag for example, and see if there are any errors or special characters in the tags, and correct them accordingly… also check the cover art too.

Here are some of the Sonos library limitations that I have gathered from other posts here in the community:

Field Name Character Limits

  • Artists - 76
  • Album - 92
  • Track - 100
  • Genre - 22
  • File name - 100

Hey, thanks for this information, I appreciate it, but I am already aware of this and am already following it. But really it has nothing to do with faulty or oversized metadata. If you read my post here:

No selections are available - again and again - no errors reported

It states that I had this error coming up over and over while making NO change at all to the files. Just by ignoring that error and running the indexing again, it eventually worked. But there was no error anywhere to see. So this does not point to any faulty/oversized metadata at all (because nothing in the files was changed). All I am pointing out, and trying to make people (and Sonos!) aware of, is that the software they deliver with very pricey hardware is mediocre, faulty, incomplete and tedious to use. The only way I could manage the library was to split it into 3 folders, each of them containing about 90 folders.

It takes far too much time to make use of the music library features.

@anmatr,

I’ve just loaded all my local library onto a Sonos Move speaker queue, just to see how many tracks I presently have in my NAS library: there are 25,697 tracks, which I am able to manually re-index in the Sonos App in precisely 3 minutes, so I’m quite happy with that re-indexing time. It takes about 6 seconds to open the ‘All Tracks’ playlist that I created and store within the library, and the tracks begin playing (on shuffle) in another 6 seconds or so when I press play… So I’m quite pleased with the time it takes to index and begin playing every track I possess. That said, over the years of building that library I have always been very careful with the editing of the track metadata and album art, and I do like to keep my music library in good order. I’m not really having any issues with the way the library works with Sonos, or any other App for that matter.


It states that I had this error coming up over and over while making NO change at all to the files. Just by ignoring that error and running the indexing again, it eventually worked. But there was no error anywhere to see. So this does not point to any faulty/oversized metadata at all (because nothing in the files was changed). All I am pointing out, and trying to make people (and Sonos!) aware of, is that the software they deliver with very pricey hardware is mediocre, faulty, incomplete and tedious to use. The only way I could manage the library was to split it into 3 folders, each of them containing about 90 folders.

It takes far too much time to make use of the music library features.

Just confirming that I have the same error, now for the second time. The first time I got around it by removing my library and adding it back in chunks, re-indexing after every added chunk, which is technically close to your solution of splitting the collection into parts.

Today I lost access to my collection due to the “no selection available” error, without changing a thing (and everything had been working fine for weeks).

My re-indexing also times out with a “not available” error, which is total BS. It takes a looong time before posting the error, so my guess is faulty code. I deduced the same thing you did: the collection (I am at 28k tracks) is well within the limit, but the indexing process is flawed. I’ll be damned if I am going to remove and re-copy everything once again just to get my library working again. This sucks big time.

Finally, I can’t remember for sure, but I do think I got the error after an update last time, and I had an update pending this morning. So updating seems to “crash” a functioning library, and if it is big enough it will not re-index.

Going to parse the MP3 tags, just to be sure. But it is completely illogical that a functional library would suddenly become dysfunctional due to some tag, especially since not a single byte was changed.


Going to parse the MP3 tags, just to be sure. But it is completely illogical that a functional library would suddenly become dysfunctional due to some tag, especially since not a single byte was changed.

So, I ripped all tags from 28k tracks into Excel, and lo and behold there were 11 tracks with faulty tags, spread over 7 (totally) different albums. By “faulty” I mean corrupt (strange bitrate numbers are a good indicator). Checked them with MP3Tag; interestingly, the tags do seem to work in MP3Tag, but the codec info is missing, the bitrate is strange, and the Tag field lists them as “ID3v2.3 Error”.

I removed these files from my collection, and now the re-indexing goes through (though it took > 7 minutes, which is unusual) and I have my library back. I can’t be 100% sure this was the reason (since SONOS had imported them twice before), but if faulty tags can mess up the re-indexing process, it might well be down to the code.

Thanks to @Ken_Griffiths for suggesting a re-check of the tags!
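For reference, roughly the same check can be scripted instead of done by hand in Excel. The Python sketch below (using the third-party mutagen library; the path and bitrate bounds are examples, not anything Sonos specifies) flags files whose tags fail to parse or whose reported bitrate looks implausible:

```python
# Flag MP3s with unparseable tags/headers or implausible bitrates, the two
# symptoms described above. Requires the third-party "mutagen" library.
from pathlib import Path
from mutagen import File, MutagenError

MUSIC_ROOT = Path("/volume1/music")          # hypothetical NAS share

for path in sorted(MUSIC_ROOT.rglob("*.mp3")):
    try:
        audio = File(path)
    except MutagenError as exc:
        print(f"{path}: tag/header error ({exc})")
        continue
    if audio is None or audio.tags is None:
        print(f"{path}: no readable tags")
        continue
    bitrate = getattr(audio.info, "bitrate", None)
    # Normal MP3s sit roughly between 32 and 320 kbps; anything outside that
    # range matches the "strange bitrate" symptom mentioned above.
    if bitrate is not None and not (32_000 <= bitrate <= 320_000):
        print(f"{path}: suspicious bitrate {bitrate}")
```

As with the earlier length check, this only identifies candidates worth re-ripping or re-tagging; it cannot confirm that a given file is what trips up the Sonos indexer.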

Just remember, there’s one Sonos device that is parsing the index, so you can see variable speeds depending on all sorts things, from network speed to CPU speed on the Sonos device. If you can ‘force’ the index process to occur on a newer Sonos, it should be quicker than an older Sonos that has a slower CPU.