Reviews and Ratings

Life-changing  
5.0
 
written about 16 years ago

My life changed when I discovered netcat.

This tool is very simple. Really, too damn simple. But it is incredibly useful.

netcat... cat over network... stdin data goes to the network, incoming network data goes to stdout... That's it!

It supports a client mode (connects to some host at some port) and a server mode (listens on some port, waiting for a connection). It supports TCP and UDP (but most of the time you will only want TCP).

I've been using netcat for many things:
1. To copy files from one computer to another.
2. To copy an entire hard drive from one computer to another.
3. To copy/paste small snippets of text from one computer to another (and directly inside vim).
4. To test DNS.
5. To test network bandwidth/throughput.
6. ...
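For the curious, the copy and throughput uses look roughly like this. This is a sketch assuming the "traditional" netcat flags (the OpenBSD variant drops -p when listening), and the host names are placeholders:

```shell
# Copy a file: start the receiver first, then the sender.
#   receiver$ nc -l -p 9999 > bigfile.tar
#   sender$   nc receiver-host 9999 < bigfile.tar
# Copy a whole disk (run from rescue media):
#   receiver$ nc -l -p 9999 > sda.img
#   sender$   nc receiver-host 9999 < /dev/sda
# Rough throughput test: stream zeros and time it.
#   receiver$ nc -l -p 9999 > /dev/null
#   sender$   dd if=/dev/zero bs=1M count=100 | nc receiver-host 9999

# The "cat over network" analogy holds because the pipe is byte-exact;
# a plain local cat shows the same fidelity (NUL bytes included):
printf 'a\0b' | cat | wc -c   # 3
```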

There is one very important difference between netcat (as a client) and the telnet client: netcat preserves every byte that passes through it, while telnet automatically converts newlines and handles some escape codes. This means netcat works great for binary data, while telnet works well as a low-level HTTP/SMTP/POP3 debugging tool.
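Because of that byte fidelity, you can also speak a line-based protocol through netcat by hand. A sketch (www.example.com is a placeholder host):

```shell
# Build a raw HTTP/1.0 request; the CRLF line endings are part of the protocol.
request='GET / HTTP/1.0\r\nHost: www.example.com\r\n\r\n'
printf "$request" | wc -c   # 41 bytes go on the wire
# To actually send it:  printf "$request" | nc www.example.com 80
```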

1 out of 1 users found the following review helpful.
Find and xargs  
4.0
   
written about 16 years ago

'find' is a great tool, really powerful and useful.

'xargs', however, looks like a promising tool, but I rarely use it because it is cumbersome. Since its default behavior reads items separated by whitespace, 'xargs' is very difficult to use with filenames that might contain spaces. I know there must be some set of parameters to fix this, but then its use becomes less straightforward. Even the manpage acknowledges this: "Because Unix filenames can contain blanks and newlines, this default behaviour is often problematic."

For this reason, I haven't used xargs in years.

If I need to run something for multiple files, I do this (in bash):

for a in * ; do something "$a" ; done

Which is easier to read and to write (although slightly verbose).

Another option is this:

some | commands | here | while IFS= read -r a ; do something "$a" ; done

Finally, I can also just use the -exec option of 'find'.
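For the record, the usual fix for the whitespace problem is NUL-separated names. A sketch assuming GNU findutils' -print0/-0 flags (the 'demo' directory is just for illustration):

```shell
mkdir -p demo && touch "demo/has space.txt" "demo/plain.txt"

# NUL separators survive spaces (and even newlines) in filenames:
find demo -type f -print0 | xargs -0 ls -l > /dev/null

# The while-read loop handles spaces once word-splitting and backslash
# escapes are disabled (names containing newlines still break it):
find demo -type f | while IFS= read -r a ; do ls -l "$a" > /dev/null ; done

# And find can run the command itself, batching arguments like xargs does:
find demo -type f -exec ls -l {} + > /dev/null

rm -r demo
```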

0 out of 1 users found the following review helpful.
Good tools, awful documentation  
3.0
   
written about 16 years ago

I guess basically all advanced users have used one of the ImageMagick tools at least a few times (convert, identify, import, display...), and I guess all of them share the same feeling: the tools are great and very powerful, but lack good documentation.

Anyone who has ever tried to seriously use 'convert' has undoubtedly spent a lot of time trying to find *some* documentation. Eventually a tutorial page turns up via Google, but its examples might not work (probably because the tutorial is outdated). Some time later, the official documentation at ImageMagick's site is finally found, but even that documentation is not easy to use.

Eventually, people either learn an obscure 'magick' command line that does the trick, or they just stop wasting time trying to figure out how to use 'convert' and do what they need with some other software.

0 out of 2 users found the following review helpful.
Almost there...  
2.0
   
written about 16 years ago

The main feature of curl, for me, is the ability to submit forms using the POST method. This means I can automate some things with scripts.

Another nice little feature is the interval syntax (like http://www.example.com/photo_[01-32].jpg).

However, curl's command-line syntax and behavior are not very good. If you just call curl, it will print progress info to the terminal and will also send the output to stdout. In my opinion, it should save the file to disk by default. And if I ask curl to save the file to disk, I must also tell it to keep the same filename as the original (the -O option). If I pass multiple URLs, I must repeat -O for every one of them.

My complaint about this is that the most common use case requires passing many command-line options. In my opinion, curl should have sensible defaults.

And this is why I use wget most of the time: wget just works and by default will do what I want.

And I use curl only inside scripts, either to download multiple files (though even then it is a pain) or to submit POST forms.
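For reference, the invocations I mean look like this. The example.com URLs are placeholders, and --remote-name-all is a later curl option that cuts down the -O repetition, if your version has it:

```shell
# Keep the remote filenames; -O must be repeated per URL:
#   curl -O http://www.example.com/a.jpg -O http://www.example.com/b.jpg
# The interval syntax, quoted so the shell leaves the brackets alone:
#   curl -O "http://www.example.com/photo_[01-32].jpg"
# Submitting a form with POST:
#   curl -d 'user=me' -d 'msg=hello' http://www.example.com/post.cgi

# The interval expands to photo_01.jpg ... photo_32.jpg; the same list
# can be generated in the shell (GNU/BSD seq -f assumed) and fed to wget:
seq -f 'http://www.example.com/photo_%02g.jpg' 1 32 | wc -l   # 32
```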

1 out of 5 users found the following review helpful.