How do I prevent xargs from running multiple commands?

I operate on lists of arguments of unknown total length, in the format produced by find /home -type f -size +512k -print0. The order of the arguments is significant.

I need to pass them on a command line such as the one that gets invoked by piping that output into xargs -r0 -- "${DANGEROUS_COMMAND[@]}" --log-file="${LOG_FILE_PATH}" --dry-run --
But I need to guarantee that "${DANGEROUS_COMMAND[@]}" is executed exactly once, with every single argument; if the list is too long or empty, everything has to abort gracefully before running anything.
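
To make the failure mode I'm worried about concrete, here is a small made-up reproduction (the file names and the tiny -s budget exist only for illustration): with more input than fits into one command line, GNU xargs quietly runs the command several times instead of refusing.

# Each output line below corresponds to one invocation; with an artificially
# small command-line budget (-s), xargs silently splits the list instead of refusing.
printf '%s\0' file-{0001..2000} |
    xargs -r0 -s 4096 -- sh -c 'echo "one invocation carrying $# arguments"' _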

It needs to be safe to run the command first with a dry-run argument (subsequent invocations would destroy the previous invocation’s log file), then run tests on the log file, and only if those all pass, finally run the command again without the dry-run argument. I need to be able to trust that the log file reflects the whole list!
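
As a toy illustration of that risk (the /tmp/demo.log path and the forced -n 2 batching are made up for the demo): every batch re-creates the log, so once xargs returns, the log only describes the final batch.

# Each batch truncates the log with '>', so earlier batches leave no trace.
printf '%s\0' a b c d e f |
    xargs -r0 -n 2 -- sh -c 'printf "%s\n" "$@" > /tmp/demo.log' _
cat /tmp/demo.log    # prints only "e" and "f"; the earlier batches were overwritten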

For weeks I mistakenly believed that the

-x, --exit
       Exit if the size (see the -s option) is exceeded.

switch in the XARGS(1) man page does this, but it only seems to take effect when a single input argument is too long; otherwise xargs still splits the arguments across multiple invocations of the command, and I have almost certainly deleted something by mistake because of this. 🙁

It seems xargs does not provide a way to do this.
Is there a similar tool that does what I described, or is writing something custom the only way to do this?

>Solution :

The simplest approach there is should work fine: skip xargs, read the NUL-delimited find output into a Bash array, and invoke the command once with the whole array.

# Read the NUL-delimited find output into an array, preserving order.
readarray -t -d '' arr < <(find /home -type f -size +512k -print0)
# Only proceed if the list is non-empty; each command below runs at most once.
if ((${#arr[@]})); then
    if "${DANGEROUS_COMMAND[@]}" --log-file="${LOG_FILE_PATH}" --dry-run -- "${arr[@]}"; then
        "${DANGEROUS_COMMAND[@]}" --log-file="${LOG_FILE_PATH}" -- "${arr[@]}"
    fi
fi
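
If you also want the two extra guards from the question, the tests on the log file and an explicit abort when the list is too long, a sketch along these lines could work. check_log_file is a hypothetical stand-in for whatever tests you run on the log, and the ARG_MAX check only matters if "${DANGEROUS_COMMAND[@]}" is an external program, since that limit constrains exec, not shell functions or builtins.

#!/usr/bin/env bash

readarray -t -d '' arr < <(find /home -type f -size +512k -print0)

# Empty list: abort before running anything.
(( ${#arr[@]} )) || exit 0

# Rough pre-flight length check. ARG_MAX also has to cover the environment
# and the command itself, so leave generous headroom instead of reproducing
# the kernel's exact accounting.
total=0
for a in "${arr[@]}"; do
    total=$(( total + ${#a} + 1 ))      # each argument plus its terminating NUL
done
if (( total > $(getconf ARG_MAX) - 8192 )); then
    echo "argument list too long, aborting before any invocation" >&2
    exit 1
fi

# Dry run, then the log-file tests, then the real run; each runs at most once.
if "${DANGEROUS_COMMAND[@]}" --log-file="${LOG_FILE_PATH}" --dry-run -- "${arr[@]}" \
        && check_log_file "${LOG_FILE_PATH}"; then
    "${DANGEROUS_COMMAND[@]}" --log-file="${LOG_FILE_PATH}" -- "${arr[@]}"
fi

Because the command is invoked directly rather than through xargs, an over-long list would in any case make the dry run fail with "Argument list too long" before anything destructive happens; the pre-flight check just turns that into a clearer, earlier abort.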
    
