Defragmentation

Hello All,

I have looked through a few discussions regarding defragmentation, but nothing helped.

My problem is with the LN client. We are running 6.5.4, and 70% of my users are remote, meaning they replicate applications to their clients.

Over the past six months, many of them have tried to defrag their laptops. Although defrag worked for most of the data, a few large and frequently used databases were not defragged (e.g., mail files, the CRM app).

We are using the basic Windows defragmenter (Windows XP) on IBM ThinkPads (T40, T42). I am not sure which way to go anymore, and this issue is becoming an increasing concern, as the performance is killing my users.

Can anyone provide some direction/guidance/light on this issue?

Thanks in Advance :o)

Subject: Defragmentation

Defragmenting requires sufficient space on your disk. If the disk that you intend to defragment is already highly fragmented, there may be insufficient free space on that disk to effectively run the defragmentation process. This is because a complete copy of the defragmented file is made in the new, defragmented location before the original clusters are marked free.

Could be that your .nsf files are too large to actually defrag now.

If they ever came into the office, I guess you could connect their laptops to the network.

Delete the replicas off the drive.

Defrag the drive.

Then create a new replica, which should be saved as a contiguous file.

Just a thought.

Rgds,

Subject: RE: Defragmentation

Ads,

Thank you for your comments. You have a point. Yet, I have 11 GB of free space. Shouldn’t that be enough to defrag these files?

database1: 416 MB

database2: 2.2 GB

database3: 3.3 GB

Therefore, I should have enough. When I look at the fragmented parts and the available parts in the defrag window, it appears that there is enough space. I realize that this statement might be an idiotic one… my apologies.

Anyway, any other thoughts?

Thanks in Advance

Subject: RE: Defragmentation

Yeah, you’d think that having 11 GB free, with substantial contiguous unused space available, would be enough to allow a defrag of a 3.3 GB file, but… sometimes the Windows defragger is just brain-damaged.

Use a better defragger, or try this: copy your data directory (or just the large files) to another drive, then delete the originals; defrag; restore (preferably restoring the largest files first); then defrag again.
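The copy-out/restore-largest-first procedure above can be sketched as a script. A minimal Python sketch with hypothetical directory paths as assumptions; the defrag run itself happens between the two functions, outside this code:

```python
import shutil
from pathlib import Path

def stage_out(data_dir: Path, staging_dir: Path) -> list:
    """Copy the database files to another drive, then delete the originals
    so the defragger has their old clusters available as free space."""
    staging_dir.mkdir(parents=True, exist_ok=True)
    staged = []
    for f in data_dir.glob("*.nsf"):
        dest = staging_dir / f.name
        shutil.copy2(f, dest)   # copy out first...
        f.unlink()              # ...then delete to free the space
        staged.append(dest)
    return staged

def restore_largest_first(staged: list, data_dir: Path) -> list:
    """Restore largest-first so the biggest files claim the biggest
    contiguous free extents left behind by the defrag."""
    order = sorted(staged, key=lambda f: f.stat().st_size, reverse=True)
    for f in order:
        shutil.move(f, data_dir / f.name)
    return [f.name for f in order]
```

Run the defragmenter between `stage_out` and `restore_largest_first`; the second defrag pass then only has small files left to tidy up.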

Subject: RE: Defragmentation

Hi All,

Thank you all for your comments and ideas. I downloaded PerfectDisk and it defragmented my laptop. I guess the makers of Windows definitely need to improve their defrag module.

Thanks Again! :o)

Subject: What’s the real problem?

Please don’t take offense, but is there actually a performance issue, or do you just want all the files to come up 100% defragged on the report? Our Windoze OS guys and the NOC come up with stuff like this all the time. Newly hired “certified” Notes admins come and go and say “tweak this and that”, but remember that rule that Grandpa taught you (no… it is not “If you can’t fix it, don’t break it.”)… Is there really an issue, and can you show that the change results in any measurable difference? If not, why do it?

Regular defragmentation (monthly?) is good preventive maintenance, but not having 100% of the files 100% contiguous should not cause a performance issue.

The OS can deal with the level of fragmentation you describe just fine. If there is no measurable, user-perceived performance issue, why try to get it down to that level? The DBs are just as likely to become fragmented again the next time you replicate.

Defragging for defragging’s sake can really be a frustrating waste of time for users, especially with the new “BIG A**” brand disks, but if Windoze made a defragger that was perfect, then all the third party vendors would complain.

Subject: RE: What’s the real problem?

The original poster had a valid defrag concern: it didn’t work.

Regarding defragging, you can’t defrag often enough. There are quite a few issues with badly fragmented files, the number one being file corruption, along with poor disk space utilization.

Subject: RE: What’s the real problem?

I disagree. You can defrag too much. That’s especially true if you are aggressive about compacting, too. Why? Because if you have active, growing databases and if you clear out their unused internal space and defrag free space in addition to defragging the files, you are most likely pushing the nearest free extent physically farther away from the physical position of the logical end of any given database. As a result, you are hurting your write performance.

(Note: if you are using transaction logging, hurting your physical write performance on database files is not necessarily going to be noticeable, unless it gets really extreme. The remainder of this post probably only applies to those who do not use transaction logging.)

The ideal strategy is actually this: First, have enough disk space to accommodate expected growth of all your most active (i.e., frequently written to as well as read from) databases for an extended period of time. Second: compact all your databases. Third, run agents to expand the size of all the highly active databases by creating many large documents in a hidden view and then deleting them. Next, defrag your disk thoroughly using a good quality tool that properly defrags free space as well as files. This effectively reserves contiguous space for your databases to grow. Do not compact and do not defrag again. Wait until the highly active databases start to outgrow the space you have reserved for them, and at that point repeat this entire process.
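The “reserve contiguous room to grow” idea in the strategy above can be illustrated at the filesystem level, outside Notes entirely. This is a hedged Python sketch of pre-growing a file, not a Notes API; the target size is an assumption you would base on expected growth:

```python
from pathlib import Path

def pregrow(path: Path, target_size: int) -> int:
    """Extend a file to target_size bytes (zero-filled) so that, after a
    free-space defrag, later writes land inside already-allocated and
    hopefully contiguous space, instead of grabbing scattered extents a
    little at a time as the file grows."""
    mode = "r+b" if path.exists() else "wb"
    with open(path, mode) as f:
        if path.stat().st_size < target_size:
            f.truncate(target_size)  # truncate() can also extend, padding with zeros
    return path.stat().st_size
```

The agent trick described in the post accomplishes the same thing inside the NSF itself; this sketch only shows the disk-allocation principle behind it.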

Subject: RE: What’s the real problem?

Client side, defragging is a good thing to do frequently. Overall file system performance is improved.

I do see the case for server-side pre-allocation of disk space. It would also make a great performance case study.

Mike Robinson

http://www.invcs.com

Subject: RE: Defragmentation

Free space is not necessarily the only concern – the space needs to have a contiguous chunk big enough to handle the files as well. The built-in defragger is horrible at creating contiguous white space – I find that my white space is more fragmented after defragging than before at least half of the time. A third-party defragger would probably do a better job. (The old FAT defragger always did.)

Subject: Try PerfectDisk

Hi Jason, I’m not sure if you posted once before on this topic (the scenario sounds very familiar). In any event, without re-hashing all the other recommendations, try a product called PerfectDisk (http://www.perfectdisk.com). I found it does a better job of defragging than the built-in defragger. The built-in XP defrag is OK, but PerfectDisk did a nicer job, and I had a noticeable speed-up in my file system. If that works for you, you could conclude that maybe the XP defragger is the problem.

Mike Robinson

http://www.invcs.com

Subject: Defragmentation, my observations

In a similar environment to what you describe I have found that:

  • Notes replication of active databases quickly fragments the disk.

  • Defragmenting is less effective if you don’t compact databases first.

  • There is a small but noticeable performance improvement when we defragment.

I use ncompact from the command line for convenience but if there are only one or two databases to compact (e.g. just mail) then compacting from within the client should be equally effective.

I agree that third-party defragmenting tools will give better results than what’s built into Windows, but I would think the advantage of compacting first would remain.
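To decide which local databases are worth compacting before a defrag, a quick size scan helps. A minimal Python sketch; the data-directory path and the 100 MB threshold are assumptions, and the compact itself would still be run via ncompact or the client:

```python
from pathlib import Path

def compact_candidates(data_dir: Path, min_bytes: int = 100 * 1024 * 1024):
    """Return (name, size_in_bytes) for .nsf files at or over min_bytes,
    biggest first -- these are the best compaction candidates."""
    hits = [(f.name, f.stat().st_size)
            for f in data_dir.rglob("*.nsf")
            if f.stat().st_size >= min_bytes]
    return sorted(hits, key=lambda t: t[1], reverse=True)
```

Feeding the top few hits to `ncompact -B` before defragmenting targets the files most likely to leave large contiguous holes behind.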

David Schaffer

Subject: Defragmentation

  • how much RAM do your laptops have? The sys reqs for Notes 6.x are 256 MB for stand-alone use and 512 MB with multiple apps open.

  • have you compacted the local databases?

  • have you confirmed it’s definitely local performance and not the link? If it’s the link, you need to review how you’re using the Notes databases, i.e. local vs. remote access.

If you haven’t seen this already, check out this IBM technote: http://www-1.ibm.com/support/docview.wss?uid=swg27006235 (Tips for Enhancing and Troubleshooting Notes 6 Client Performance)

Subject: RE: Defragmentation

Hi Mark,

I have 700 MB of memory, and 34% disk space free on a 30 GB disk (11 GB).

I have three databases that do not defrag. So, I performed an in-place compaction on the “local” databases (option -B), then ran Windows defrag, and it still doesn’t want to defragment these particular databases. The sizes of the databases are:

database1: 416 Mb

database2: 2.2 Gb

database3: 3.3 Gb

The disk space that I have left over (11.7 GB) should be enough to defrag these databases, no?

I will take a look at the link you sent and see what I can find here. Other than that, do you have any other ideas?

Thanks in Advance :o)

Subject: RE: Defragmentation

So the users are using local replicas for mail and apps. How are they resolving names? Directory catalog?

what exactly are they complaining about?

  • is it the speed of emails being sent or received?

  • access to local databases?

  • the speed of replication between local and server replicas of the databases?

if you could tell us what the function the databases perform i think it would help greatly in troubleshooting.