Status
Not open for further replies.

Michael Bateman

Lightroom catalogs have to be on a local volume, of course. But referenced images can go on a network drive, which is perfect when you have a lot of RAW images, etc. For my part, I sometimes run into trouble because the volume always gets mounted as "home" or "home-1" etc., which can be a problem if you ever connect to more than one NAS, as I often do.

I've discovered a great trick on my Mac with something called autofs. Here is a good article on how to keep network volumes mounted using autofs. This works great with my existing catalog images. I can give the network volume a unique name and it appears in a folder called "servers" in my home directory. Lightroom never has trouble finding it.

...except if I try to import from a folder on the share!
....or if I try to use a folder on the share as a destination for an import!
.....or if I try to browse a folder in the share using Bridge!

Yeah, for some reason my Adobe products just don't like it when I mount network folders this way - and I would love to be wrong about that! Anyone?

Most of you probably don't have a "NAS-centric" workflow like me and rely more on externally mounted USB 3 and Thunderbolt drives for hot and warm projects, maybe using a NAS for cold storage. (Just remember, kids: NAS is not backup, but it can be used as part of an overall backup strategy - but I digress!)

I should probably re-think my workflow. But I just love my Canon 7D with the Wireless File Transmitter WFT-E7A. It does a great job of putting all the files from the camera straight onto the NAS where I can use shell scripts to automate my workflow, renaming files, backing them up, etc.

But in the meantime I thought I would reach out here and see what everyone else does. I am new to the forum; this is my first post. Anyone else here use a NAS with Lightroom? Any ideas for me? I have two: a Synology DS1513+ and a QNAP TVS-871T.

In any event, thanks for reading my post to anyone who has made it this far!

-Michael
 
Welcome to the forum Michael!

When you say that Adobe products don't like it when you mount the drives this way, what happens? They just don't show up?
 
Hello! Thanks for the reply. Basically yes: you see the folder where the mounted shares should be, but in Bridge you see a "lock" and in Lightroom you see just the folder and no subdirectories. Strangely, when importing from the share into Lightroom, you can specify "Show SubDirectories" and it will show you the photos you want to import AND EVERYTHING ELSE ON THAT SHARE, so yeah, not very practical. Do you use a NAS? Which one? How do you mount its shares for use with Lightroom?
 
And by the way, if you think I should call tech support and open a ticket, sure, but I was mostly curious what anyone else does in this situation. If you have never played with one of these machines, they are amazing: always on and available for chores like cloud backups and any workflow task you can imagine. I would blog about it myself, except I am JUST NOT quite sure I am doing it right myself, so why confuse anyone else!!! : )
 
One more re-reply! Lightroom is COMPLETELY fine with the folders once the photos are in there. It's the import process that cannot navigate around the shared folder, either to specify a FROM directory or a destination directory. If I map the drives the way one normally would in OS X, with "Go to Server..." etc., mounting them through the Finder, I can import from the share and specify any destination folder on the share without a problem. The issue is that Lightroom subsequently can't find them, because it remembers the images being in a shared folder called Home-1, but now something else has co-opted the Home-1 designation and the images are in a folder designated as Home.
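What's going on under the hood: the catalog remembers each image by its absolute path, so when the share remounts under a different name, every stored path goes stale. A toy sketch in Python (hypothetical paths, not Lightroom's actual schema):

```python
# Toy illustration: the catalog records an absolute path, so a renamed
# mount point orphans every image stored under it.
catalog_entry = "/Volumes/Home-1/Photos/2016/IMG_0001.CR2"

remembered_mount = "/Volumes/Home-1"   # what the catalog recorded
current_mount = "/Volumes/Home"        # where the share landed after a remount

# The stored path still matches the old mount point...
old_match = catalog_entry.startswith(remembered_mount + "/")
print(old_match)   # True

# ...but nothing under the new mount point matches, so the image
# shows up as "missing" until the folder is relinked.
new_match = catalog_entry.startswith(current_mount + "/")
print(new_match)   # False
```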

-Michael
 
A WAG would be that it has to do with the mount point. Lr does sometimes get confused about that with /Volumes, so perhaps something similar is going on with the shares. But I don't use a NAS with it much, so I'm not sure where it looks for 'em.
 
Yes, /Volumes was where home and home-1 were mounted, and truly the only problem was probably (my WAG now!) this: the first one to be mounted was home, the next home-1. Then if for whatever reason one was disconnected, it would be re-mounted as home-2, and if I could trick Lr into finding the photos again, that path got written into the link to the image in the database, and it would all start over on the subsequent re-mount.

Look, the main issue is that I want to use the home folder service on each NAS, but give Lr a unique handle for each, right? I will play some more with this and report back. I may be on my own here in the world of Lightroom users with a NAS, but I find that surprising. If you knew what these machines can do when you have the workflow set up properly, you'd wonder how you ever managed without one. : )

-Michael
 
In the article on autofs you cite, it specifically warns against using /Volumes as the mount point. Using the same name doesn't help either, since the home-1 and home-2 suffixes change arbitrarily.
 
In the article on autofs you cite, it specifically warns against using /Volumes as the mount point.
YES. Don't try this at home, kids, thanks. I had misgivings about that soon after I posted it. The home folder concept is important and many systems use it. The NAS mounts the folder as "home" when the physical folder on the system is actually "homes/username"; if they allowed you to mount it as "username", you might have a chance at avoiding these issues, so long as you had a different username on each system, I suppose. But thanks, yeah, no: don't mount in /Volumes. Bad idea.
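For anyone trying the same thing, the setup being described looks roughly like this (a sketch based on the standard macOS automounter; hostnames, usernames, share names and the map file name are all placeholders, and the mount point is deliberately outside /Volumes):

```text
# /etc/auto_master -- add one line pointing at a custom map
/Users/yourname/servers    auto_nas

# /etc/auto_nas -- one uniquely named entry per NAS share
nas1_home  -fstype=smbfs  ://username:password@nas1.local/home
nas2_home  -fstype=smbfs  ://username:password@nas2.local/home
```

Reload with `sudo automount -vc`; each share then appears under ~/servers with a stable, unique name instead of whatever home-1/home-2 the Finder hands out.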
 
 
Hello. I just wanted to post a quick update and cautionary note about Lightroom and network drives and NAS systems.

Don't use network drives with Lightroom catalogs or images.

I am a huge fan of Lightroom. I am a huge fan of both Synology and QNAP. But the only way to use these cool devices is to mount a network drive on your computer; Lightroom will let you reference images on it, and it will work well until it doesn't. It's a really bad idea.

I now have a Thunderbolt LaCie external drive large enough to accommodate my main catalog and library. It's backed up religiously to my NAS, which in turn is backed up regularly.

Most of you seem to know this.

It's a shame. I am not gonna vent about it here; like you, I have work to do!! But man, I long for a product like Lightroom that is a workgroup tool allowing people to collaborate within the same catalog of images.

But if it's just me and my Canon 7D Lightroom is the bomb and I look forward to browsing this forum for tips on how to make it work for me when I leave my desk!! I am eyeing several threads about a travel catalog etc.

Thanks all.

Lightroom catalogs have to be on a local volume, of course. But referenced images can go on a network drive, which is perfect when you have a lot of RAW images, etc. For my part, I sometimes run into trouble because the volume always gets mounted as "home" or "home-1" etc., which can be a problem if you ever connect to more than one NAS, as I often do.
 
Hi Michael

Absolutely right that the catalog can't be on the NAS. Images are OK, as long as you know how to reconnect them if they show up as missing because the mount point's changed.

If you're eyeing the travel catalog threads, you might find this one useful: How do I use my Lightroom catalog on multiple computers?
 
It's a shame. I am not gonna vent about it here; like you, I have work to do!! But man, I long for a product like Lightroom that is a workgroup tool allowing people to collaborate within the same catalog of images.
The LR catalog file is a single-user database and uses a DB engine called SQLite. A multi-user database requires user access control and record locking to prevent two users from changing the same database records at the same time. To get that kind of industrial-strength database, you need to put your data on a heavy-duty database server and run a database engine like Oracle, which can cost thousands of $$ and require a full-time staff of dedicated database administrators. Then an app like LR would need to be rewritten as a front end to this database. Adobe (I'm sure) has determined that there is not enough demand for this type of multi-user database app to justify the cost of development. And if it were developed, the LR client certainly could not be sold for a $10/mo subscription.
 
Hi Michael

Absolutely right that the catalog can't be on the NAS. Images is ok, as long as you know how to reconnect them if they show up as missing because the mount point's...

Just to be clear though, in regard to workflow, which is presumably about a repeatable, efficient process: I strongly advise against having a catalog reference an image on a network drive. Lightroom will let you, but it's a bad, bad idea. Or maybe having 120,000 images totaling 2.2 TB is a bad idea. But I will never have a Lightroom catalog reference an image on a network drive again. It's just too much work to deal with a corrupted catalog and too big of a mess to undo. If anyone has a workflow that relies upon a NAS and works more than 90% of the time, let me know.

But yeah, getting it to work is not the problem. It's getting it to not fail that I have an issue with!!

Thank you! I will check that out. : )
 
Perhaps I'm misunderstanding Michael - exactly what kind of problem were you having when the images were on the NAS? That wouldn't have corrupted a catalog, although it can be a little slow for it to load.
 
The LR catalog file is a single-user database and uses a DB engine called SQLite. A multi-user database requires user access control and record locking to prevent two users from changing the same database records at the same time. To get that kind of industrial-strength database, you need to put your data on a heavy-duty database server and run a database engine like Oracle, which can cost thousands of $$ and require a full-time staff of dedicated database administrators. Then an app like LR would need to be rewritten as a front end to this database. Adobe (I'm sure) has determined that there is not enough demand for this type of multi-user database app to justify the cost of development. And if it were developed, the LR client certainly could not be sold for a $10/mo subscription.

An Oracle database would certainly be overkill. For example, supporting MySQL would be fairly simple; the SQL commands are almost the same. There are state-of-the-art technologies to support multiple database engines with the same piece of code.

Best Regards
Wernfried
 
An Oracle database would certainly be overkill. For example, supporting MySQL would be fairly simple; the SQL commands are almost the same. There are state-of-the-art technologies to support multiple database engines with the same piece of code.

Best Regards
Wernfried
Welcome to the forum.
"Industrial strength" databases usually require two components on two different computers: a client (the LR part) and a server (the database part). A major rewrite would be required to turn LR into a client component for a MySQL- or Oracle-type database. (FWIW, I used Oracle as an example since it is a name most non-technical users can identify; MySQL, Microsoft SQL Server, PostgreSQL and IBM DB2 are less well known.)
 
Welcome to the forum.
"Industrial strength" databases usually require two components on two different computers: a client (the LR part) and a server (the database part). A major rewrite would be required to turn LR into a client component for a MySQL- or Oracle-type database. (FWIW, I used Oracle as an example since it is a name most non-technical users can identify; MySQL, Microsoft SQL Server, PostgreSQL and IBM DB2 are less well known.)

No, that's wrong. For a typical database connector it does not matter whether you connect to a local SQLite DB file or an Oracle or MySQL database, no matter if this is hosted on local computer or somewhere else.

See this example written in Perl, it would work for any database with the same UPDATE command:

Code:
use DBI;    # DBI provides the database-independent interface

my $dbh;
if ( $db eq "SQLite" ) {    # note: the variable is $db, not bare db
    $dbh = DBI->connect("DBI:SQLite:dbname=C:/Catalogs/Lightroom-2.lrcat", "", "");
} elsif ( $db eq "Oracle" ) {
    $dbh = DBI->connect("dbi:Oracle:$db", $username, $password);
} elsif ( $db eq "MySQL" ) {
    $dbh = DBI->connect("DBI:mysql:$db:$hostname", $username, $password);
}

$dbh->do("UPDATE Adobe_images SET fileFormat = 'JPG' WHERE id_local = 76543");

Sure, when you really write such program you have to consider many details but in general it works transparent.

And of course, for a "Multi-User Environment" it makes sense to have the database running on one central remote server, but this is not a must.

Best Regards
 
I have a Synology NAS, but do not use it for Lightroom or Photoshop related files. Occasionally, I will use it as an extra external backup for my images.

My preference is to have my Catalog and Cache folders on local SSD drives and images on a fast local drive. In due course I will test 10GbE cards from my workstation to a 10GbE NAS to see what impact that makes, but not any time soon.
 
I have a Synology NAS, but do not use it for Lightroom or Photoshop related files. Occasionally, I will use it as an extra external backup for my images.

My preference is to have my Catalog and Cache folders on local SSD drives and images on a fast local drive. In due course I will test 10GbE cards from my workstation to a 10GbE NAS to see what impact that makes, but not any time soon.
***
Hi Gnits, thank you for your post. Long story short, I am interested in possibly acquiring a Synology NAS... DiskStation DS1817+ | Synology Inc.
I use LR on a Mac. Could you please elaborate on WHY you do NOT choose to use it for Lightroom or Photoshop related files? When you "import" your photos into LR, I thought you mentioned that you were storing your photos "...on a fast local drive...."?
Surely, that local storage would fill up quickly?
How do you back up those photos?
And how do you back up those photos so that Lightroom can keep track of them?
Thank you in advance for any comments.
LRnewbie736
 
***
Could you please elaborate on WHY you do NOT choose to use it for Lightroom or Photoshop related files? When you "import" your photos into LR, I thought you mentioned that you were storing your photos "...on a fast local drive...."?
Surely, that local storage would fill up quickly?
How do you back up those photos?
And how do you back up those photos so that Lightroom can keep track of them?
Thank you in advance for any comments.
LRnewbie736

I dunno about Gnits, but I also use an external SSD and HDD for images, not a NAS. They're faster, and I perhaps do more moving around than some people. Local storage can be huge: not only can you have numerous regular ol' HDDs, but RAID as well. NAS is not necessarily bigger; it's just the way it's connected: network (slow) vs. USB or Thunderbolt (faster). "Local" doesn't necessarily mean "internal".

You can also back up to externals. And/or to the cloud. Gnits, I see, is on Windows. But I use a Mac and use Time Machine to back everything up to rotating HDDs that I alternate off site, plus one permanently attached HDD. And to the cloud as well. Lr doesn't need to keep track of where those are as long as you keep the same folder structure; you just reconnect the images to the backed-up catalog and you're back in biz.

Hope that helps; if not, sorry for butting in.
 
My setup. Windows.
C drive. SSD internal. O/S and apps. Kept below 100GB used to keep backups small and fast.
G drive. SSD internal. Lr Catalog
P drive. HD internal. Photos and personal data
Q drive. HD internal. Backup of P drive, Lr Cat and C drive.
T drive. Ext HD ...External backup of P drive, Lr Cat and C drive.
Nas drive. Occasional backup of P Drive, Lr Cat and C drive.

Backup routine.
A. At 6am every morning Macrium Reflect automatically backs up my C drive and Lr Cat to Q drive.
B. At 6:10 am every morning GoodSynch copies all new and changed files from P drive to Q drive.
By 6:20 approx. my PC shuts down.


Totally automated. When I wake up I have an email summary confirming all backups successful.

If I import a photo shoot, after the import process I will manually trigger step B above.
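For anyone without a sync tool, a one-way "new and changed files" pass like step B can be sketched in a few lines of Python. This is purely illustrative (a made-up helper, not what GoodSynch actually does): no deletions, no conflict handling, no logging.

```python
import os
import shutil

def mirror_new_and_changed(src, dst):
    """One-way mirror: copy files that are missing from dst or newer in src.
    Illustrative sketch of a 'copy new and changed files' backup step."""
    for root, _dirs, files in os.walk(src):
        target_dir = os.path.join(dst, os.path.relpath(root, src))
        os.makedirs(target_dir, exist_ok=True)  # recreate the folder tree
        for name in files:
            s = os.path.join(root, name)
            d = os.path.join(target_dir, name)
            # copy if the destination is missing or older than the source
            if not os.path.exists(d) or os.path.getmtime(s) > os.path.getmtime(d):
                shutil.copy2(s, d)  # copy2 preserves timestamps
```

Re-running it after an import copies only the new shoot, which is roughly the behavior described above.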

The reason I do not use the NAS is that when I bought a NAS drive, all the network components, including the network card in my PCs, the network interface on the NAS, my managed switch, and the NAS's own speed, were max 1 Gb connections. I could not believe how long it took to complete routine tasks.

The situation is different now, in that NAS devices can be bought with much faster network connections and fast internal write speeds. But you still need to make sure that there are no bottlenecks, such as slow switches or routers, faulty cables, etc. Your NAS may have a 10GbE network interface card, but do your workstation, switch and all the other bits in the pipeline have these specs, and do they all work together?



My P, Q, T drives started off at 1 TB, currently 3 TB and half full, in a few years time probably 8TB. I have approx 85,000 images in my catalog. For my volumes I am not worried about internal drives being too small. When that happens there will be alternative solutions.
 
No, that's wrong. For a typical database connector it does not matter whether you connect to a local SQLite DB file or an Oracle or MySQL database, no matter if this is hosted on local computer or somewhere else.

See this example written in Perl, it would work for any database with the same UPDATE command:
.....
Sure, when you really write such program you have to consider many details but in general it works transparent.
Is it possible to write SOME code that will work on MOST databases - sure. But "in general it works transparent[ly]"? Really?

What you say is simply not true. Not all databases support the same syntax, or the same semantics. Even something as simple as ISNULL as a function may become IFNULL or even COALESCE. But the syntax issues are minor compared to the semantics of statements: case sensitivity, null handling, error handling, transaction handling and scopes, collation sequences, and support for nested, embedded, recursive and other advanced queries.
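The ISNULL point is easy to demonstrate with Python's built-in sqlite3 module (a minimal sketch with made-up table and column names): COALESCE is portable standard SQL, while the SQL Server spelling ISNULL(x, y) is rejected by SQLite outright.

```python
import sqlite3

# Minimal sketch of the dialect problem (made-up table/column names).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE img (id INTEGER, rating INTEGER)")
conn.execute("INSERT INTO img VALUES (1, NULL)")

# COALESCE is standard SQL and works across SQLite, MySQL, Oracle, etc.
coalesced = conn.execute("SELECT COALESCE(rating, 0) FROM img").fetchone()[0]
print(coalesced)  # 0

# ISNULL(x, y) is the SQL Server spelling; SQLite does not accept it.
try:
    conn.execute("SELECT ISNULL(rating, 0) FROM img")
    isnull_failed = False
except sqlite3.OperationalError:
    isnull_failed = True
print(isnull_failed)  # True
```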

Sorry... it's just not that easy. There is a HUGE price a software developer pays for a decision to support lots of back-end databases, both in development cost and, unless you seriously inflate that development cost, even more so in performance and reliability.

Remember... performance and reliability are by far the two most frequent complaints most of us hear about Adobe.

And of course, for a "Multi-User Environment" it makes sense to have database running on one central remote server but this is not must.

Just for the record, SQLite does support a significant amount of concurrency, and Adobe COULD have decided to use it for multiple users. There are architectural issues in a high update environment but LR usually isn't in a high update mode, and having 2-3 users updating at once is certainly within SQLite's capabilities. There are also concurrency issues in some NAS applications that make locking hard (notably NFS).

Had they wanted to do so, there are of course much better AND FREE options: MySQL and Postgresql are two that are quite compatible. When I install Resolve (Video editing software) it installs Postgresql as a shared working environment automatically.

The decision not to be multi-user is one Adobe made, not BECAUSE of SQLite, but having made it, it made SQLite a more viable alternative. The decision impacts their code in a huge variety of ways -- database access is the least of them; coordinated access and caching are much larger. With multi-user applications, you cannot depend on anything in (non-shared) memory, so memory caching of information becomes a non-starter. Think for a moment about two people editing the same image at the same time. OK, that may be too obvious; how about changing keywords while someone else is assigning them?

The bigger deal is: Multi-user applications are inherently slower than single user applications.

So be careful what you ask for if you are wanting Adobe to make Lightroom multi-user.

PS. Just for the irony of it: I spent most of today migrating a bunch of MySQL stored procedures to Postgresql functions for Zabbix - a product that DOES support both databases, and the migration was pretty much a rewrite. And if you look inside their code (It's open source) you see massive amounts of parallel code for different back ends. It's NOT "transparent".
 
I recently moved my originals to a DS415+. This only works well if you tell LR to prefer smart previews for all work. If you do this then you'll only touch the originals when you import or export them -- which generally is a "get a cup of coffee" operation, regardless of where they sit.

My sources were ~100GB and growing, which was starting to threaten my SSD's capacity. The smart previews are around 20GB.

---------

Around the same time I increased the synology's RAM to 8GB, of which typically 7GB is being used as a file cache. I really think this has eliminated any speed concerns accessing the originals. My home LAN is all wired 1Gb.

I run LR from one Windows and one Mac computer. Moving the sources to the NAS means one level of backup (replicated photos on the two desktops) is gone, so I've added a backup from the NAS to an internet backup (Amazon Drive).

(On second thought .. I suppose the .xmp's are also being written to the NAS .. so far I haven't noticed any irritating delays with this.)
 
SQLite locks the entire database while you have an open write transaction. It also does not support any kind of user management or privileges, so using SQLite for multi-user applications has many limitations.
Regarding "transparency", it depends: the more database-specific functions you use, the less transparency you will have. But when you use the database just as a "stupid" data store, it becomes easier.

A Lightroom catalog SQLite database uses tables, indexes and some simple triggers - that's all. Given such a tiny scope, it should be possible to support other RDBMSs as well. But of course, that is a decision Adobe has to make.
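The locking point can be sketched with Python's built-in sqlite3 module (made-up schema, nothing Lightroom-specific). In WAL mode a reader coexists with an open write transaction, but a second writer is refused immediately; note too that SQLite's own documentation says WAL does not work over a network filesystem, one more reason catalogs don't belong on a NAS.

```python
import os
import sqlite3
import tempfile

# timeout=0 makes lock conflicts fail immediately instead of waiting;
# isolation_level=None gives us explicit transaction control.
path = os.path.join(tempfile.mkdtemp(), "demo.db")

writer = sqlite3.connect(path, timeout=0, isolation_level=None)
writer.execute("PRAGMA journal_mode=WAL")   # WAL: readers coexist with a writer
writer.execute("CREATE TABLE img (id INTEGER)")
writer.execute("BEGIN IMMEDIATE")           # take the write lock
writer.execute("INSERT INTO img VALUES (1)")

# A reader is not blocked, but sees only committed data:
reader = sqlite3.connect(path, timeout=0, isolation_level=None)
count_during = reader.execute("SELECT COUNT(*) FROM img").fetchone()[0]
print(count_during)  # 0 -- the INSERT is not committed yet

# A second writer, however, cannot start its own write transaction:
second = sqlite3.connect(path, timeout=0, isolation_level=None)
try:
    second.execute("BEGIN IMMEDIATE")
    locked = False
except sqlite3.OperationalError:   # "database is locked"
    locked = True
print(locked)  # True

writer.execute("COMMIT")
```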
 