I’ve written a find function that searches for a string in each file in a given path, while skipping a list of directory names that I don’t want searched. I’ve placed this script in my .bashrc file to be called like so: The find portion works great, and it colorizes the search text so that it visually stands out, but
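A minimal sketch of such a function, using a hypothetical name `ff` and an assumed skip list (`.git`, `node_modules`), since the original code is elided:

```shell
# Hypothetical sketch of the .bashrc function described above:
# prune the unwanted directories, then grep the remaining files
# with colorization forced so matches stand out even through a pipe.
ff() {
  find . \( -name .git -o -name node_modules \) -prune -o \
    -type f -print0 \
    | xargs -0 grep -H -n --color=always "$1"
}
```

The `-prune` branch stops find from descending into the listed directories, and `--color=always` (GNU grep) keeps the highlighting when output goes through the pipe.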
Tag: find
“Paths must precede expression” error with Linux find
I have the following command: I get the error “find: paths must precede expression: 1990”. Why is this? I have quoted the wildcard, so I don’t see where the error comes from. How do I fix this? Thanks! Answer: You need to repeat the -iname argument:
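The underlying cause: an unquoted pattern like `*1990*` is expanded by the shell before find runs, so the extra file names land after the expression and trigger the error. A sketch of the quoted, repeated form (the second pattern is assumed for illustration):

```shell
# Quote every pattern so the shell passes it through literally,
# and repeat -iname for each one, joined with -o inside \( \)
find . -type f \( -iname '*1990*' -o -iname '*1991*' \)
```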
Is it possible to reliably nest find -exec commands in Linux?
I need to fix permissions on a large number of files from a subset of directories on a large NFS volume. In building up a solution, I started by using find to get a list of the directories I want: The directories all start with a number. I can successfully use this to -exec a second find. That find gets
Linux or Unix find: shortcut for matching a partial name
What I usually type: What I want to type (for example, mnemonic = “all”): or I was just thinking there could be a shortcut, in the style of find . -iname “blah”, or rgrep instead of grep -r. Even though, like the grep example, it only saves a couple of characters, they are shifted characters, not as easy to
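There is no built-in short form, but a small wrapper in .bashrc gets the same effect; a sketch with an assumed mnemonic `fa`:

```shell
# "fa pattern" behaves like: find . -iname "*pattern*"
fa() { find . -iname "*$1*"; }
```

So `fa blah` lists any entry under the current directory whose name contains “blah”, case-insensitively.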
Exclude list of file extensions from find in bash shell
I want to write a cleanup routine for my makefile that removes everything except the necessary source files in my folder. For example, my folder contains files with the following extensions: .f .f90 .F90 .F03 .o .h .out .dat .txt .hdf .gif. I know I can accomplish this with: Using negation, I can do this: But, when I
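A sketch of the negated form, assuming the keep-list is the .f/.f90/.F90/.F03/.h sources; it prints the deletion candidates, and you can swap `-print` for `-delete` once the list looks right:

```shell
# List everything that is NOT one of the source extensions;
# replace -print with -delete after checking the output.
find . -type f ! \( -name '*.f' -o -name '*.f90' -o -name '*.F90' \
  -o -name '*.F03' -o -name '*.h' \) -print
```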
How to display a specific line from the last few lines of a given number of files?
I have a folder with many different files. I want to display a specific line from the last 100 lines of a list of files. So far I have tried both grep and tail, but neither gave me exactly what I want. Example: the folder has the following files: file_1.txt file_2.txt file_3.txt other_file.txt other_file2.txt Content of file_n.txt is: line 88:
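A sketch combining the two tools, assuming GNU grep and the “line 88:” example from the question:

```shell
# Restrict grep to the last 100 lines of each file by piping tail
# into it; -H with --label re-attaches the file name to each match.
for f in file_*.txt; do
  [ -e "$f" ] || continue          # skip if the glob matched nothing
  tail -n 100 "$f" | grep -H --label="$f" 'line 88:'
done
```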
Tar search results in .sh file
I have to tar a list of files, without paths, that is the result of a find, via sh (for crontab use). In Ubuntu’s shell each command works fine, but not in the .sh script. I tried with: And also with: But both failed. Can someone help? Alternatives? Additional scenario info: /myfolder/ contains: one1.log one2.log one3.log two1.log two2.log I want
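One variant that works non-interactively: print bare file names with GNU find’s `%f` and let tar resolve them relative to the folder via `-C` and `-T` (paths taken from the scenario above; the archive name is assumed). The caveat is file names containing newlines:

```shell
# -printf '%f\n' emits names without the directory part;
# -C /myfolder makes tar resolve those bare names there, and -T -
# reads the list from the pipe, so no shell word-splitting occurs.
find /myfolder -maxdepth 1 -type f -name 'one*.log' -printf '%f\n' \
  | tar -czf /tmp/onelogs.tar.gz -C /myfolder -T -
```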
Combining find command conditions
I am trying to combine two find commands to pipe to a grep pattern match. My three commands are: get files modified in the last 24 hours; ignore a couple of directories; find a pattern in the files’ lines. I have gotten the below to work but can’t seem to add directory skipping into the mix. get files modified in the last
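A sketch of all three pieces combined, with the skipped directory names and the pattern assumed for illustration:

```shell
# -prune short-circuits the skipped directories; everything that
# survives and was modified in the last 24 hours (-mtime -1) is
# handed NUL-delimited to grep, which lists matching files.
find . \( -path ./skipdir1 -o -path ./skipdir2 \) -prune -o \
  -type f -mtime -1 -print0 \
  | xargs -0 grep -l 'pattern'
```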
How to use the content of a variable as options for find in a bash script?
In a bash script I’m trying to use the content of a variable as options for the find command: The aim is to set proper access rights because, for whatever reason, quite a few files (such as pictures or office documents) are marked executable. So in reality the script above has several arrays with file extensions concatenated,
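The usual pitfall is keeping the options in a plain string, which either word-splits wrongly or not at all. A bash array keeps each option a separate word; a sketch with assumed extensions:

```shell
# Each array element stays one argument when expanded as
# "${opts[@]}", so find sees the tests exactly as typed.
opts=( -iname '*.jpg' -o -iname '*.docx' )
find . -type f \( "${opts[@]}" \) -perm /111 -exec chmod a-x {} +
```

Here `-perm /111` (GNU find) matches files with any execute bit set, and `chmod a-x` strips those bits.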
Getting the output of “find” into a CSV
I’ve got a project where I’m trying to collect all the files on a Linux system into a CSV to be dumped into a DB. My script has: But the problem is that I have filenames (%f) and directory names (%h) with commas in them. So then I thought I could just put each item in quotes, and that won’t
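Quoting alone isn’t quite enough, since a name can itself contain a double quote; CSV handles that by doubling embedded quotes. A sketch (the scan root is a placeholder, and the %h/%f/%s fields are assumed) that survives commas and quotes, though names containing newlines remain a caveat:

```shell
# Emit %h, %f, %s on three separate lines per file, then rebuild
# each CSV row, doubling any embedded double quotes (bash expansion).
find /path/to/scan -type f -printf '%h\n%f\n%s\n' \
  | while IFS= read -r h && IFS= read -r f && IFS= read -r s; do
      h=${h//\"/\"\"}
      f=${f//\"/\"\"}
      printf '"%s","%s",%s\n' "$h" "$f" "$s"
    done
```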