Grep is an amazing tool for searching through log files and pulling out useful information, but what if you want to search a log file using a giant list of keywords from another file? Luckily, grep has that feature built in as well.

This happened to me when I wanted to pull a list of all URLs that were requested by a huge block of IPs that was abusively attacking our server. After identifying and creating the list of a thousand IPs, I needed to pull the URLs from the main log file to identify the most requested resources.
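Building that initial IP list can itself be a short pipeline. Here's a minimal sketch assuming a combined-format access log; the file name access.log and the sample log lines are invented for illustration:

```shell
# Hypothetical three-line sample of a combined-format access log; in
# practice you would read the real file, e.g. your server's access log.
cat > access.log <<'EOF'
203.0.113.5 - - [10/Oct/2023:13:55:36 +0000] "GET /index.html HTTP/1.1" 200 512
203.0.113.5 - - [10/Oct/2023:13:55:37 +0000] "GET /login HTTP/1.1" 200 128
198.51.100.7 - - [10/Oct/2023:13:55:38 +0000] "GET /index.html HTTP/1.1" 200 512
EOF

# Field 1 is the client IP; sort | uniq -c tallies requests per IP,
# and sort -rn ranks the busiest IPs first.
awk '{print $1}' access.log | sort | uniq -c | sort -rn
```

Piping a ranking like that through awk '{print $2}' strips the counts and leaves the bare IP list, ready to be saved as the pattern file for grep.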

To do this, you'll want to use the -f argument, which allows you to specify a file containing the list of patterns to search for.

-f FILE, --file=FILE
    Obtain patterns from FILE, one per line. The empty file
    contains zero patterns, and therefore matches nothing.
    (-f is specified by POSIX.)

Assuming your set of keywords or strings is in a file named "searchstrings", you can use the argument on the command line as in the following example. Since this search is going to generate a ton of data, the "> output.txt" part of the command redirects the results into a file called output.txt that can be analyzed separately.

grep -f searchstrings filetosearch > output.txt
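To make that concrete, here is a tiny end-to-end sketch. The file names searchstrings and filetosearch match the command above; the IPs and log lines are invented for the demo:

```shell
# Two "abusive" IPs to hunt for, one pattern per line.
printf '203.0.113.5\n198.51.100.7\n' > searchstrings

# A three-line stand-in for the main log file.
cat > filetosearch <<'EOF'
203.0.113.5 - - "GET /admin HTTP/1.1" 404
192.0.2.10 - - "GET /index.html HTTP/1.1" 200
198.51.100.7 - - "GET /admin HTTP/1.1" 404
EOF

# Keep only the log lines whose IP appears in searchstrings.
grep -f searchstrings filetosearch > output.txt
cat output.txt
```

Only the first and third log lines land in output.txt; the line from 192.0.2.10 is filtered out.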

The only issue with the -f argument is that grep will interpret each keyword as a regular expression, which can slow it down considerably when matching against an extremely large file. To avoid that, you can also specify the -F parameter, which tells grep to treat the keywords as fixed strings and do exact matches only.

-F, --fixed-strings
    Interpret PATTERN as a list of fixed strings, separated by
    newlines, any of which is to be matched. (-F is specified by
    POSIX.)

So the full command would end up being more like this:

grep -F -f searchstrings filetosearch > output.txt
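Besides the speed benefit, -F also prevents false positives: in a regular expression a dot matches any character, so an IP used as a pattern can match strings it shouldn't. The file names and contents below are again invented to show the difference:

```shell
# One IP pattern; note that its dots are regex wildcards without -F.
printf '203.0.113.5\n' > searchstrings
printf '203.0.113.5 real hit\n203x0y113z5 false positive\n' > filetosearch

# Without -F the dots act as wildcards, so both lines match.
grep -f searchstrings filetosearch

# With -F only the literal string 203.0.113.5 matches.
grep -F -f searchstrings filetosearch
```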

Grep is a ridiculously powerful way to search log files, so it would be well worth your time to look through the man page.