Remove 64GB database size limit 
Use this IdeaSpace to post ideas about Domino Server.

Category: Domino Server / Data Storage and Management
Tags: 64GB, NSF, database
Posted by Nick Radov on 26 Jan 2008

I would like the 64GB limit on NSF and DB2NSF databases to be greatly increased. This limit has been in place since 5.0 and it causes serious problems for large and complex applications. Our web applications store hundreds of GB of medical data (mostly XML and other structured text formats) and so we have had to write a lot of extra code to break up the data into smaller chunks but still allow end users to query it all together.

According to the IBM developers I've talked to, it wouldn't be too difficult to remove the limit. The main issue is that they would have to do a lot of extra testing to ensure reliability. So if this idea gets enough votes perhaps they will commit to do that testing.

The new document compression feature coming in 8.0.1 and DAOS in 8.5 will help marginally, but they don't really solve the problem.
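The workaround described in the original post — breaking the data into smaller chunks across multiple databases while still letting users query it together — can be sketched roughly as hash-based routing. This is a hypothetical illustration, not the poster's actual code; the names (`SHARD_COUNT`, `shard_for`, the `meddata*.nsf` filenames) are invented for the example.

```python
import hashlib

# Hypothetical sketch: route each record to one of several NSF databases
# by hashing a stable key, so no single database approaches the 64GB limit.
SHARD_COUNT = 8  # e.g. meddata1.nsf .. meddata8.nsf (invented names)

def shard_for(record_key: str) -> str:
    """Return the database file a record belongs to, chosen by hashing its key."""
    digest = hashlib.sha1(record_key.encode("utf-8")).hexdigest()
    index = int(digest, 16) % SHARD_COUNT  # deterministic, evenly spread
    return f"meddata{index + 1}.nsf"
```

A query layer would then fan each search out to all shard databases and merge the results, which is exactly the kind of extra code the post complains about having to write.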

1) The Turtle1101 (28 Jan 2008)
64 GB? When it gets past about 5 GB I usually restructure the app so that it's not so monolithic. Anything approaching 64 GB is bound to have massive performance issues in other ways.
2) Nick Radov1404 (28 Jan 2008)
@1: No, performance on databases up to near the 64GB limit is fine as long as the application is correctly designed and you have good hardware. Look at it this way: DB2 and other relational databases can easily handle terabytes of data with excellent performance. Why should Domino be so limited?
3) Margaret Suddards13 (30 Jan 2008)
Of course IBM has to ensure that the database can perform at that size, but I agree, why should it be limited. If it wants to grow up and be a real robust database for complex applications, they have to remove that limit.
4) Starrow Pan5103 (20 Feb 2008)
Radov, I'm rather curious about the design and hardware of your application that could keep relatively good performance with about 64 GB of data in A DOMINO DATABASE.
5) John Foldager1155 (29 May 2008)
Or at least IBM should make sure that writes to a database can't push it past the 64GB size limit, which corrupts the database!
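The guard this comment asks for could look something like the following sketch. It is a hypothetical illustration of the idea (check the file size before accepting a write), not anything Domino actually does; `write_allowed` and the constant name are invented.

```python
import os

# 64 GB, the NSF size limit discussed in this thread
NSF_SIZE_LIMIT = 64 * 1024**3

def write_allowed(db_path: str, incoming_bytes: int) -> bool:
    """Hypothetical pre-write guard: allow a write only if it would
    keep the database file at or under the 64GB limit."""
    current = os.path.getsize(db_path) if os.path.exists(db_path) else 0
    return current + incoming_bytes <= NSF_SIZE_LIMIT
```

A server applying such a check would reject the write cleanly instead of letting the database grow past the limit and corrupt itself.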
6) Atilla Öztürk937 (07 Oct 2009)
@Starrow: I think that's not the issue.

IBM could engineer that out: a good balance between the responsiveness of large NSFs and NSF features like full-text indexing, view refreshes, object compaction, etc.

Or, put better:
IBM, just make NSFs "smart" on their own :-)

The only alternative would be a simpler implementation and integration with DB2 enablement in Lotus Domino...
7) Wim Stevens580 (06 Feb 2011)
We have a few "shared mailbox" applications that grow constantly over time. When the size reaches about 15 GB, performance becomes an issue.
We expect IBM to deliver good-performing databases larger than 64 GB, potentially containing millions of documents. DB2 and Oracle have no problems with databases of this kind; Domino should be as robust and powerful.

Wim Stevens

