author     rodent <rodent>  2013-04-07 20:49:31 +0000
committer  rodent <rodent>  2013-04-07 20:49:31 +0000
commit     da9d557ccd4313490cdcaabf97f4bce19df13498
tree       79de035a23bbb285c8713b492fa4fa499634f7f8 /sysutils/agedu
parent     c0be38a4f5e5d44fa50008152988b43e17259963
download   pkgsrc-da9d557ccd4313490cdcaabf97f4bce19df13498.tar.gz
Edited DESCR in the case of:
  File too long (should be no more than 24 lines).
  Line too long (should be no more than 80 characters).
  Trailing empty lines.
  Trailing white-space.
Truncated the long files as best as possible while preserving the most
info contained in them.
Diffstat (limited to 'sysutils/agedu')
-rw-r--r--  sysutils/agedu/DESCR  46
1 file changed, 20 insertions, 26 deletions
diff --git a/sysutils/agedu/DESCR b/sysutils/agedu/DESCR
index cd7823b2a44..ed74862ddf6 100644
--- a/sysutils/agedu/DESCR
+++ b/sysutils/agedu/DESCR
@@ -1,30 +1,24 @@
-Suppose you're running low on disk space. You need to free some
-up, by finding something that's a waste of space and deleting it
-(or moving it to an archive medium). How do you find the right
-stuff to delete, that saves you the maximum space at the cost of
-minimum inconvenience?
+Suppose you're running low on disk space. You need to free some up, by finding
+something that's a waste of space and deleting it (or moving it to an archive
+medium). How do you find the right stuff to delete, that saves you the maximum
+space at the cost of minimum inconvenience?
 
-Unix provides the standard du utility, which scans your disk and
-tells you which directories contain the largest amounts of data.
-That can help you narrow your search to the things most worth
-deleting.
+Unix provides the standard du utility, which scans your disk and tells you which
+directories contain the largest amounts of data. That can help you narrow your
+search to the things most worth deleting.
 
-However, that only tells you what's big. What you really want to
-know is what's too big. By itself, du won't let you distinguish
-between data that's big because you're doing something that needs
-it to be big, and data that's big because you unpacked it once and
-forgot about it.
+However, that only tells you what's big. What you really want to know is what's
+too big. By itself, du won't let you distinguish between data that's big because
+you're doing something that needs it to be big, and data that's big because you
+unpacked it once and forgot about it.
 
-Most Unix file systems, in their default mode, helpfully record
-when a file was last accessed. Not just when it was written or
-modified, but when it was even read. So if you generated a large
-amount of data years ago, forgot to clean it up, and have never
-used it since, then it ought in principle to be possible to use
-those last-access time stamps to tell the difference between that
-and a large amount of data you're still using regularly.
+Most Unix file systems, in their default mode, helpfully record when a file was
+last accessed. Not just when it was written or modified, but when it was even
+read. So if you generated a large amount of data years ago, forgot to clean it
+up, and have never used it since, then it ought in principle to be possible to
+use those last-access time stamps to tell the difference between that and a
+large amount of data you're still using regularly.
 
-agedu is a program which does this. It does basically the same sort
-of disk scan as du, but it also records the last-access times of
-everything it scans. Then it builds an index that lets it efficiently
-generate reports giving a summary of the results for each subdirectory,
-and then it produces those reports on demand.
+agedu does the same disk scan as du, but it also records the last-access
+times of everything it scans. It then builds an index that lets it
+efficiently generate reports summarizing the results for each subdirectory.
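
The core idea the DESCR describes — combining a du-style size scan with
last-access timestamps — can be sketched in a few lines. This is NOT agedu's
implementation (agedu builds an on-disk index and serves interactive reports);
it is only a minimal illustration of how atime lets you separate old, unused
data from data still in regular use. The function name and threshold are
hypothetical.

```python
#!/usr/bin/env python3
"""Sketch of the idea behind agedu: walk a tree and total up the size
of files whose last-access time is older than a cutoff, so that big,
long-forgotten data stands out from big data you still use."""
import os
import time


def stale_bytes(root, max_age_days=365):
    """Return the total size in bytes of files under `root` that have
    not been read within the last `max_age_days` days."""
    cutoff = time.time() - max_age_days * 86400
    total = 0
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # file vanished or is unreadable; skip it
            if st.st_atime < cutoff:  # last read before the cutoff
                total += st.st_size
    return total
```

Note the usual caveat: this only works if the file system actually maintains
access times — mounts using `noatime` (or, to a lesser degree, `relatime`)
will make every file look recently read or never read.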