/tech/ - Technology

File: 1454135071791.png (59.58 KB, 690x250, 69:25, sudo-makelove.png)

 No.509232

Post some custom bash scripts. Pic related.

I also have JUST aliased to sudo rm -rf $HOME for no reason.

 No.509241

I made this to rename files.


#!/bin/bash
clear
echo "Put the file extension"
read extt
echo "Put a name for the new files E.g. 'Picture_'"
read nombre
echo "Put the initial value e.g. '10'"
read valor

for i in *.$extt
do
mv "$i" `echo "$i" | tr ' ' '_'`;
done

for fichero in `ls *.$extt`
do
mv $fichero $nombre$valor.$extt
let valor++
done

echo "Ficheros renombrados:"

for fichero in `ls *.$extt`
do
echo $fichero
done


 No.509242

>Don't expect any sort of fanciness. Really. It is just a simple script that compresses the selected folder with xz, encrypts it with GPG and then moves it to a destination folder (local or remote). Plain and simple.

https://github.com/gustawho/backupd


 No.509246

script to clean up temp files

sudo rm -rf /


 No.509248

>>509246

nice. gonna save that one if you don't mind.


 No.509251

>>509248

Pay the license fee first, faggot. It's not GPL.


 No.509254

>>509246

>>509248

>>509251

>proprietary software in a nutshell


 No.509264

>>509254

beautiful, anon

i think i'm gonna screencap this


 No.509285

>>509254

>>509264

>metaposting in a nutshell


 No.509335

>>509232

Script to run programs on the dGPU instead of the iGPU. Either call it directly with a program and its arguments, or do ln -s ~/bin/dgpurun ~/dgpubin/progname so that the next time I run "progname" it runs this script with $0 being the program name. The script then removes ~/dgpubin from $PATH and runs the real program on the dGPU.

(lots of unnecessary logging... took some time to get it to work properly)

#!/bin/bash
#log=/home/ebin/bin/dgpurun.log
log=/dev/null
echo `date`: $0, args: $@ >> $log
echo $PATH >> $log
usage="Usage: dgpurun (-h|--help|program_to_run [argument1] [argument2] ...)"
export DRI_PRIME=1
if [ ${0##*/} != dgpurun ];then
prg_dir=`which $0`
# echo old path: $PATH
export PATH=`sed "s@${prg_dir%/*}:@@" <<< $PATH`
echo new path: $PATH >> $log
prog=`which ${0##*/}`
echo $prog >> $log
if [ "$#" -gt 0 ]; then
echo $prog "$@" >> $log 2>> $log
$prog "$@" # >> $log 2>> $log
succ=$?
echo `date`: returned: $succ >> $log
exit $succ
else
echo No arguments >> $log
echo $prog >> $log 2>> $log
$prog # >> $log 2>> $log
succ=$?
echo `date`: returned: $succ >> $log
exit $succ
fi
elif [ "$1" = --help ] || [ "$1" = -h ];then echo $usage
elif [ "$#" -gt 0 ]; then
"$@"
succ=$?
echo `date`: "$@" returned: $succ >> $log
exit $succ
else
echo $usage
exit 1
fi


 No.509391

File: 1454154416057.png (267.26 KB, 1278x709, 1278:709, 2016-01-30_12:43:45.png)

I have some here https://github.com/Q3CPMA/shell-scripts

and here https://github.com/Q3CPMA/shell-scripts

For the cbz thumbnailer, don't forget this entry:


cat /usr/share/thumbnailers/cbz.thumbnailer
[Thumbnailer Entry]
TryExec=convert unzip zipinfo
Exec=cbz_thumbnailer.sh %i %o %s
MimeType=application/x-cbz;


 No.509398

>>509391

Bad paste, second link is supposed to be https://github.com/Q3CPMA/dotfiles/blob/master/.zshrc


 No.509423

>>509391

Appleseed was good. Shirow went the wrong way with GiTS, just techno-babble and intrigue for the most part. Appleseed had that late-80s sci-fi feel to it.


 No.509426

That's a shit script because neither pacman nor 'bleachbit' is a standard utility. You could explain it, the flags, and how to implement it with different package managers. And they're called 'Bash scripts' or 'shell scripts'. The 'custom' is assumed.


 No.509448

>>509423

I like both, but not the episodic nature of GITS.


 No.509594

>>509426

I know it's shit, but I was more concerned about making a quick macro than portability. I don't expect anyone to use it.


 No.509673

anyone have the archive extractor script posted here months ago?


 No.509753

>>509673

https://ghostbin.com/paste/f3bwg

I modified it to also be able to create archives (it uses the parallel version of a compressor if one is available), extract more file formats and shit. Just read the usage. I know that the create part looks ugly and is copy-paste abuse, but it should work.
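For anons who can't reach the paste: the core of such an extractor is just a case dispatch on the filename. This is a minimal sketch of the pattern, not the linked script itself; the function name and the handful of formats are mine:

```shell
#!/bin/sh
# minimal sketch of the case-dispatch pattern extractor scripts use;
# the real script in the paste supports many more formats
extract() {
    for f in "$@"; do
        case $f in
            *.tar.gz|*.tgz)  tar xzf "$f" ;;
            *.tar.xz|*.txz)  tar xJf "$f" ;;
            *.tar.bz2)       tar xjf "$f" ;;
            *.zip)           unzip -q "$f" ;;
            *)               printf 'unknown archive: %s\n' "$f" >&2; return 1 ;;
        esac
    done
}
```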


 No.509823

>>509753

> *.tar.lrz)

>lrunzip "$f"

>tar xvf "{f%.lrz}"

You know that lrzip provides lrzuntar, right.


 No.509882

>>509241

why are you using "read", rather than passing arguments?


 No.510087

>>509391

What's your toxID? I think I had you added before but I lost your contact


 No.510167


 No.510197

>>510087

I have uninstalled it, but since both Arch and Gentoo added them to the official repos, I might reinstall it.


 No.510198

>>510087

E98EFB57F76B1663EB66A4C1A7DF85F2C3B7E3A263CF9D4FDB968E2CCB8F2740D5CFCA44E630


 No.510220

#!/usr/bin/env bash
# Converts 320x200 bitmap images into C64 programs.
# [... blah blah GPL copyright, etc....]

OUTPUT=${2-output.prg}
FOO=/tmp/$$.pnm

# This produces a lookup-table for converting little-endian hex values
# into big-endian decimal (i.e: 0x01 -> 128, 0x80 -> 1)

declare -A flip
for i in `jot -w %02X 256 0`
do
a=`(echo -n '00000000'; echo 'ibase=16;obase=2;'$i|bc )|rev|cut -b1-8`
b=`echo 'ibase=2;'$a|bc`
flip[$i]=$b
done

# ..and now for the bitmap to code conversion!

anytopnm $1 >$FOO

(cat <<EOF
10 f=0:b=1
20 poke53272,24:poke53265,59:poke53280,b:poke53281,b
30 c=f*16+b:fori=1024to2023:pokei,c:next:i=8192
50 readp:r=int(p/256):c=p-(r*256)
60 ifr>=0thenpokei,c:i=i+1:r=r-1:goto60
70 ifi<16192then50
80 goto80
EOF

for y in `seq 0 8 192`
do
for x in `seq 0 8 312`
do
pamcut <$FOO -left $x -top $y -width 8 -height 8 \
|pbmtoxbm|tail -1 \
|sed -e 's/0x//g' -e 'y/abcdef/ABCDEF/' \
-e 's/[^0-9A-F]/ /g' -e 'y/ /\n/' \
|grep -v "^$"
done
done \
|while read -r i; do echo ${flip[$i]}; done \
|uniq -c \
|awk '{print (($1-1)*256) + $2}' \
|fmt 70 73 \
|sed -e 's/ /,/g' -e 's/,0/,/g' \
|awk 'BEGIN{l=100}{print l"data"$1; l++}' \
)| petcat -text -w2 -o $OUTPUT

rm -f $FOO


 No.510255

>>509246

I wonder how many niggers executed the command :^)


 No.510267

File: 1454246830096.png (1.06 MB, 1920x356, 480:89, 1453478136048.png)

Pic related, entirely created with lemonbar and n30f, backed by fish scripts: https://github.com/onodera-punpun/dotfiles/tree/master/lemonbar


 No.510268

File: 1454246854977.png (6.05 KB, 620x305, 124:61, 1453478354940.png)


 No.510272

File: 1454246903331.png (1018.96 KB, 1213x767, 1213:767, 1453478659486.png)

my most advanced and helpful script, tracks your series/anime. automagically fuzzy searches for the right directory and episode name in a specified media directory.

For example 'Surgeon Bong Dal Hee' matches the directory 'Surgeon-Bong-Hee Eps1-18 complete'


 No.510273

File: 1454246940059.png (13.59 KB, 571x494, 571:494, 1453478840830.png)

>>510272

forgot link: https://github.com/onodera-punpun/bin/blob/master/neet

Some more random scripts in pic related


 No.510884

>>510255

as someone still learning how to do bash, I almost entered that command until I read your comment, and after looking it up, it seems it would've deleted everything on my root partition.

this is one of the reasons I hate chans. they give harmful information to people trying to learn or expecting a cool or useful outcome.

so

>>509246

sincerely, suck my dick.


 No.510888

>>510884

That's one of the methods used to tell newfriends to git gud or fuck off.

:^)


 No.510889

File: 1454298290455.gif (1.72 MB, 325x260, 5:4, 1422087553553.gif)

>>510888

>Heh...I'll make them delete their shit.

>GIT GUD NEWFAGS!

>Why aren't people using Linux? They must be Microsoft shills!


 No.510895

>>510889

It teaches an important lesson to RTFM and not to blindly enter stuff.


 No.510898

>>510889

Hey, man, screw those guys.

But, if you're interested, I've got a bridge you could buy.


 No.510905

>>509232

Is it bad that I know a bit of C, but have never taken a stab at bash scripts?


 No.510907

>>509246

Don't be telling newfags about this one, they need to learn how to do this themselves. Teach a man to fish yadda yadda.


 No.510934

>>510884

With great power comes great responsibility, etc etc...

Didn't you read the prompt when you first entered sudo?


 No.510996

>>510889

It's not really an unknown command, plus, just using the man page, you'd know that rm is for removing shit, and / is the root file system.

If you've practiced proper partitioning, though, it's a non-issue. I keep my /home in a separate partition for exactly issues like this.

In fact, I've actually used that command for fun so that I could just install gentoo on top, and I don't even do bash scripts or shit like that. And that was just on debian and ubuntu of all things.


 No.511021

>>510905

No, Bash syntax will probably give you cancer.


 No.511025

>>510888

>

>>510889

this

>>510895

No, it just destroys trust and makes it more difficult for anyone to trust someone when they're trying to help.

>>510934

I use Debian, I don't use sudo.

Still, that doesn't excuse that it's a thread for people who may be new to bash.

So far, all of the replies defending it basically have the same message: if anyone runs it, they deserved it for not knowing what the person who posted it knew, which is the exact opposite of helping them learn.


 No.511036

>>511025

It gets tiring to see anons who can't take the time to look up their question beforehand.

Also, it's a good idea to look up shit before running it.


 No.511037

>>511025

>It just destroys trust

>the exact opposite of helping them learn

Holy shit cry more. You should NEVER copypaste shit into your terminal without knowing what it does. You have to first take a good glance at it, and then you may execute if you want. I mean, this is even more obvious if you are learning. >What does this command do? "man sudo" "man rm", oh I see.

That's one of the reasons why Free Software is important, you are not forced to trust anyone, but the code itself.

You have 0 excuses; if that shit was legit and you pasted it and continued with your life, you would have learned basically nothing anyway.

>I don't use sudo

You should and probably will at some point. If you see a command that uses sudo it is doing something that needs root privileges so be extra aware of that. Installing software for example needs root privileges.


 No.511043

>>511025

It takes 5 fucking seconds to google a bash command. No excuse. If you hand-feed them they don't learn shit and they will eventually abandon linux because they never learned the bare basics.

Cry moar.


 No.511046

>>511044

>>511045

>>511043

lolfag


 No.511048

>>509246

does that shit actually work on linux lmao what a cuck'd up operating system


 No.511069

>>509246

I entered the command thinking it was what you said. Luckily mint has a failsafe for stuff like this.

>>510889

This anon is right. I switched to GNU/Linux because I would constantly browse /tech/. I learned a lot and tried new things because of this site. I looked up to this board, but that's not cool, man; I'm not the only newbie you duped either. I know you were having your fun, but still, this is 8ch /tech/, not cuckchan /g/; you guys aren't dicks here.

>>511043

>>511044

>>511045

This guy is why the masses haven't switched to GNU/Linux. I know many who tried but came across douches like him with his "git gud noob, cry moar" attitude.


 No.511076

dd if=/dev/zero of=/boot/*

speeds up boot


 No.511078

>>511076

fuck you


 No.511081

>>511078

that response is what i wanted. stay mad fag


 No.511166

>>511069

I'm more than happy to help someone if the query isn't something so fucking simple like "What do I do when it asks for my password" or "what's the terminal".


 No.511169

>>511048

Says the person getting raped by Microsoft and Apple.

Loving every second and not giving a shit that everyone knows.

It's PEBKAC.


 No.511170

>>511076

Suggestion for next time: add status=progress so that anons can see the rate at which the speedup occurs.


 No.511220

>>510889

natural selection


 No.511505

>>511037

Holy shit you're retarded.

Thanks for contributing fuck all to the discussion while also being unhelpful.

>say I don't use sudo

>says I will at some point

>not understanding that every distro doesn't work like your Ucuntu shit

I don't use sudo because it doesn't exist on Debian. You use su. JFC, again, holy shit you're retarded.

>>511043

>you shouldn't teach people because google teaches people

>if you teach people they'll abandon the OS because they didn't learn the basics from a search engine

>ha ha everything I say is so great you must be crying right now XDDD

off yourself

You people are the reason nobody wants to use Linux.

You're so butthurt when people tell you to stop spamming harmful or just unhelpful shit in threads meant to help people learn more.

You bitch and cry when people tell you how stupid you are, and then say they're crying. It's ironic and retarded as shit.

Kill yourselves.

>>511069

this


 No.511514

>>511025

>>511505

Wait, what? You don't use sudo? Debian usually installs it if you don't specify a root password, so it's not as if it never comes with the OS. I'm a bit confused, how do you run apt-get? Surely you don't go


$ su
# apt-get install blah
# logout

Because that would be absurd. I hope you don't really use su or log in as root. You realize root != administrator, right?


 No.511521

>>511505

Dude, just install Windows 10 and never come back.

You deserve it.


 No.511546

>>510267

damn, that looks good


 No.511566

>>511076

>of=/boot/*

You don't know how shell expansion and dd work, do you?


 No.511571

#! /bin/bash
exec ~/Games/ColEm26-Linux-Ubuntu-bin/colem ~/Games/ColEm26-Linux-Ubuntu-bin/PepperII.col

Like ==YOU== have anything better to do with ==YOUR== life..


 No.511573

I run DOS games off of a RAM disk because of occasional pauses on an encrypted HDD, this is to automate copying it on and off with a backup, and also moves a .bat file for DOSBox to execute the game once it starts. The script is symlinked to the names of different game folders.

#!/bin/sh
ZPROG=`basename "$0"`
ZPROGEXT="_BAK"
ZPROGBAK="$ZPROG$ZPROGEXT"
ZPATHDOS="$HOME/x86-16"
ZPATHRAM="$HOME/tmp/ramfs"
echo "Starting $ZPROG, copying to $ZPATHRAM/$ZPROG"
cp -r "$ZPATHDOS/$ZPROG" "$ZPATHRAM"
echo "Copying startme.bat"
cp "$ZPATHDOS/$ZPROG.bat" "$ZPATHRAM/startme.bat"
dosbox
echo "Removing backup"
rm -rf "$ZPATHDOS/$ZPROGBAK"
echo "Moving original to backup place"
mv "$ZPATHDOS/$ZPROG" "$ZPATHDOS/$ZPROGBAK"
echo "Moving ramfs copy to original"
mv "$ZPATHRAM/$ZPROG" "$ZPATHDOS"


 No.511575

>>511069

>>511505

>GNU/Linux hasn't become more popular because a few people are mean

You two are awfully fucking retarded if you really think that's why a lot of people aren't switching. It may be a reason for a small minority of people, but it sure as hell isn't the main reason for everyone else.


 No.511576

>>511505

>I don't use sudo because it doesn't exist on Debian

I was going to tell you about the sudoers list, but I think I'll just laugh at you instead.


 No.511584

>>511505

> don't use sudo because it doesn't exist on Debian.

wat


 No.511596

>>511521

off

>>511575

your

>>511576

>>511584

selves

>there are people this retarded on /tech/

not surprised

it's become the norm to be retarded here


 No.511616

>>511596

Please explain how there is no sudo in debian, since I am running it now and I use sudo daily.


 No.511693

>>511596

I'm assuming you're >>511505

I'm >>511514

I'm seriously wondering how you use apt if there is no sudo. It's kind of worrying. Makes me think you're constantly using su to do things or logging in as root. And I'm actually trying to help.

>>511616

sudo is only installed by default in Debian if you do not specify a root password during install. Otherwise you have to install it yourself.


 No.511776

>>511693

What? No you don't. I specified a root password on install and the only thing needed to use sudo on an account is to edit the sudoers file.


 No.511777

>>511776

https://wiki.debian.org/sudo

>As of DebianSqueeze, if you ask for the Desktop task during the installation, that pulls in sudo with a default configuration that automatically grants sudo-ing rights to any member of the sudo group.

My information was out of date, but still, it doesn't always come with the base install, I suppose.


 No.511778

>>511076

An improvement:

dd if=/dev/zero of=/usr/lib/* bs=512M


 No.511828

>>511778

>>511076

See

>>511566

Also learn RTFM.


 No.511844

>>511778

Nicely done! but i still prefer to destroy /boot


 No.511924

>>511566

from your post, you seem to be the one who doesnt understand shell expansion or dd


 No.512132

>>511924

Anon was quoting to point that out, newfriend.


 No.512174

>>512132

you dont understand what i said


 No.512181

>>511778

won't work with slack64 since it's 'pure' 64-bit

but this will

dd if=/dev/zero of=/usr/{lib,lib64}/* bs=512M


 No.512239

>>511924

The shell won't expand *, because the same word contains "of=". It will write 512M of zeroes to the file '/boot/*'. The unlikely exception to this is if there's a non-empty directory with a path of ./of=/boot/ relative to your working directory.

Having multiple of arguments won't work, so if the shell expansion does happen correctly somehow, it will destroy one file at most.


 No.512246

>>512174

Probably.

Sorry.


 No.512364

>>509241

quote your fucking variables

Use arguments instead of reading.

Clearing is impolite, let the user do that.

Remember "--" on mv commands and such.

Don't use echo to print non-constant strings. Use printf instead.

Don't parse ls. Use "for" instead. (「for file in *."$ext"; do ...; done」)

Don't use backticks. Shit I swear everyone should know they're deprecated by now. Use $(...) instead.

Let is a shit bashism, use valor=$((valor + 1)) instead and your entire script will be posix compatible.

Don't assume all files in *.$extt will be renamed, because some mv commands may have failed. Instead count successes as you go.
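Putting those points together, here's a sketch of how that rename script could look; the function name, example prefix, and argument order are mine, not OP's:

```shell
#!/bin/sh
# sketch: rename *.EXT to PREFIX<N>.EXT, counting successes as we go
# usage: renamer EXT PREFIX START   e.g.  renamer png Picture_ 10
renamer() {
    ext=$1 prefix=$2 n=$3 count=0
    for f in *."$ext"; do
        [ -e "$f" ] || continue            # glob had no matches
        if mv -- "$f" "$prefix$n.$ext"; then
            count=$((count + 1))           # only count mv successes
        fi
        n=$((n + 1))
    done
    printf 'Ficheros renombrados: %d\n' "$count"
}
```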

>>509335

quote your fucking variables

Ever wondered why your usage string doesn't support newlines or proper spacing? Ever wondered why running "touch a" in the current directory completely fucks up the usage message? QUOTE YOUR FUCKING VARIABLES!

Don't echo non-constant strings. Use "printf" instead.

Don't parse "which". Use "command -v" instead.

Avoid sed where you could easily cut out the string in bash. Especially if you can't sanitize the input first.

Don't keep reopening the log file, instead exec a descriptor open.

Everything from 「echo $prog "$@"」 downwards is functionally identical to the same part in the other "if" branch. Just move it out of the conditional entirely and chop out 5 duplicate lines of code.

Don't use backticks. Use $() instead.

It's usually better to make a function called "usage" instead of making yourself echo it each time.

Nice usage of the "[" command in the last 2 conditions. Given the rest of the script those parts are surprisingly well written.

>>510220

quote your fucking variables

Backticks!! Use $().

The cat subshell should be written with { ... } because being in a subshell is irrelevant to its function, so the shell should decide.

Consecutively piped commands should probably be indented one.

No excuse to pipe a simple sed into awk. Awk does what sed does.

Make the rm cleanup into a trap, rather than just running it as an afterthought.

Don't print non-constants with echo. Use printf instead.

If you're intent on writing bash-only code, you might as well go the whole hog and use bash's seq operator rather than invoking it as an external command. 「`seq 0 8 192`」 → 「{0..192..8}」

You could probably do your big sed|grep better in one perl invocation.

Uppercase variable names should be reserved for environment variables.
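The brace form in action (bash-only; smaller numbers here for brevity):

```shell
# {START..END..STEP} expands in the shell itself, so no fork to seq is needed
bash -c 'echo {0..24..8}'   # prints: 0 8 16 24
```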

>>511571

least broken script ITT

>>511573

Use the :? expansion for all variables involved in a rm -rf. Safety first.

You've got quoting down which is great.

Don't make variable names uppercase unless you export them.

Remember your "--" arguments.

You probably want to use "&&" for most of the script, as you don't want dosbox to start if the file is missing.

Don't use echo for non-constant strings. Then, once you introduce printf, don't use echo at all (for consistency).
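For reference, the :? guard mentioned above looks like this (the paths here are throwaway examples, not the anon's actual ones):

```shell
# ${var:?} makes a non-interactive shell abort with an error when var is
# unset or empty, so a typo can never expand a cleanup into 'rm -rf /'
dir=$(mktemp -d)
mkdir -p "${dir:?}/scratch"
rm -rf -- "${dir:?}/scratch"   # if $dir were empty, the script dies here instead
rmdir "$dir"
```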


 No.513125

>>512364

>Use the :? expansion for all variables involved in a rm -rf. Safety first.

I'll do it for ZPROG and maybe HOME because unless something is really wrong that's the only one I could see being unset.

>Don't make variable names uppercase unless you export them.

Sure. I assume then it's safe to remove the Z prefix from them too? That was for in case I overlapped another variable.

>Remember your "--" arguments.

Done.

>You probably want to use "&&" for most of the script, as you don't want dosbox to start if the file is missing.

Actually I'll add a check to see if the game folder exists before doing anything else. That'll also stop it from working if I run the script directly instead of through a gamename symlink.

>Don't use echo

Done.

Also added {} around variable names, because apparently that's how you write "${zprog}_BAK"; with that, I don't really need the _BAK variable anymore. Also swapped backticks to $() as per your suggestion to others.


 No.513132

>>513125

>Sure. I assume then it's safe to remove the Z prefix from them too?

Yep. Since the unspoken etiquette is to have all environment variables in uppercase (except "http_proxy" for historical reasons), it's difficult to accidentally collide names with the environment when using lowercase variables.

>I'll do it for ZPROG and maybe HOME because unless something is really wrong that's the only one I could see being unset.

In this script it doesn't matter too much but in general it's a good idea, just in case you make a typo, or forget a variable change during refactoring, etc. Since the only literal string is a "/" there it could have unwanted consequences if something like that were to happen, but you're free to be as cautious or non-cautious as you please there.


 No.513139

I modified ruario's latest-firefox script to repackage icecat for me. Couldn't figure out how to make it autodetect the newest version since icecat's download url is structured differently to firefox's, so instead you just pass it a command-line option. Hopefully another slacker will find this useful.

https://ghostbin.com/paste/epuoy


 No.513213

>>512364

best fucking post in this whole thread.

thank you for your time and autism anon


 No.514275

:^)


#!/bin/bash
# Usage: giftowebm.sh [.GIFURL]  ([[ ]] is a bashism, so bash rather than sh)
[[ "$1" != *"gif" ]] && printf "URL needs to be a .gif (pronounced jiff).\n" && exit
gfyR=$(curl -fs "https://upload.gfycat.com/transcode?fetchUrl=$1") && printf "$gfyR" | sed 's/,/\n/g;s/\"//g;s/\\//g;s/http/https/g' | grep "Size\|Url" | sort -r || printf "Upload failed.\n"


 No.514286

>>514275

>service as a software substitute

#!/bin/bash
# Usage: giftowebm.sh URL [URL...]
for i in "$@"; do
[[ "$i" != *".gif" ]] && printf "URL needs to be a .gif (pronounced jiff).\n" && exit
done
for i in "$@"; do
FILE="$(mktemp XXXXXXXXXX.gif)"
wget "$i" -O "$FILE"
ffmpeg -i "$FILE" "${FILE%.gif}.webm"
rm "$FILE"
done


 No.514342

unfuck-pacman(){

sudo rm /var/lib/pacman/db.lck

}


 No.514345

>>514342

What do you do that stops Pacman from unlocking when it stops?


 No.514346

>>514345

I abuse it.


 No.514538

>>514275

>>514286

For the record here is my script which does the same, adapted from the ffmpeg docs and supposedly creates high quality webms:

#!/bin/sh
prog_name=${0##*/}

usage() {
printf '%s: Encode a webm using ffmpeg
Usage:
%s [options] [--] [file...]

Parameters:
--lossless Encode losslessly.
--help Show this message and exit.
-- Signifies the end of the options list. An argument not beginning with a dash has the same effect, except -- is not treated as a filename.
file... List of files to convert. Their converted name will be the same as the original, with ".webm" appended.
' "$prog_name" "$prog_name"
}

lossless=

for arg do
case "$arg" in
(--lossless) lossless="-lossless 1";;
(--help) usage; exit;;
(--) shift; break;;
(-*) printf 'Unknown argument %s\n' "$arg" >&2; exit 1;;
(*) break;;
esac
shift
done

command -v ffmpeg >/dev/null || { printf 'This script requires ffmpeg\n' >&2; exit 1; }

for file do
ffmpeg -i "$file" -c:v libvpx-vp9 $lossless -pass 1 -b:v 1000K -threads 1 -speed 4 \
-tile-columns 0 -frame-parallel 0 -auto-alt-ref 1 -lag-in-frames 25 \
-g 9999 -aq-mode 0 -an -f webm -y /dev/null
ffmpeg -i "$file" -c:v libvpx-vp9 $lossless -pass 2 -b:v 1000K -threads 1 -speed 0 \
-tile-columns 0 -frame-parallel 0 -auto-alt-ref 1 -lag-in-frames 25 \
-g 9999 -aq-mode 0 -c:a libopus -b:a 64k -f webm "$file.webm"
done

Check out that case statement for the supported flags.

>>514342

>>514346

If you need this you should seriously reconsider your workflow. How often do you "kill -9 pacman"?

That script is asking for trouble.


 No.514675

>>514286

Mine was solely so I downloaded less to view a 'gif' faster. Downloading a full gif and converting to webm using ffmpeg defeats the purpose of my script.

Nice script though.


 No.514683

>>514346

I get that much. How do you abuse it?


 No.514765

File: 1454732067579.jpg (53.45 KB, 540x527, 540:527, 1967 office christmas part….jpg)

guys i need a bash script for managing memory cache.

i run this sometimes 2x a day....

sudo sh -c "sync; echo 3 > /proc/sys/vm/drop_caches"

I have no idea how to code this, running linux mint

Maybe somehow set a marker for over 10GB of memory used, and the bash script will run the drop-caches line


 No.514875

>>514765

Why do you need to do this in the first place? You're doing something fundamentally wrong


 No.514886


 No.514888

>>509241

just use mv


 No.515015

>>514765

>guys i need a bash script for managing memory cache.

Check out this page, it's a common enough question that someone made a domain for it: http://linuxatemyram.com

I think you should stop using that script entirely.

However, if you do want to do it, look at root's crontab. See "man 5 crontab" and the various pages online about it.
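If you ignore that advice anyway, a root crontab entry would look something like this; the schedule and command are illustrative, not a recommendation:

```shell
# hypothetical root crontab entry (edit with 'sudo crontab -e'):
# drop caches at 03:00 daily -- again, you probably shouldn't do this at all
#   0 3 * * *  /bin/sh -c 'sync; echo 3 > /proc/sys/vm/drop_caches'
```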


 No.515024

Oh, 2 more things:

1. Windows has a disk cache too, it just hides the evidence from the user more.

2. And "man 1 crontab" as well as the page 5 version. Still don't recommend it though.


 No.515026

>>514765

>guys i need a bash script for managing memory cache.

>sudo sh -c "sync; echo 3 > /proc/sys/vm/drop_caches"

Either you don't know what you're doing or something is leaking memory. Find out what's leaking and see if there's a way to fix it first.


 No.515044

>>509391

Is that Thunar? Cause my Thunar setup looks very similar, but totally shit.


 No.515061

>>515044

That's pcmanfm.


 No.519030

>>514275

I got a printf invalid character error so I tried instead of

printf "$gfyR" | sed 's/,/\n/g;s/\"//g;s/\\//g;s/http/https/g' | grep "Size\|Url" 

do

sed 's/,/\n/g;s/\"//g;s/\\//g;s/http/https/g' <<< "$gfyR"| grep "Size\|Url" 


 No.519035

>>509246

to everyone hating on this guy, if people are trying to learn and you're worried about them just executing code, they should be using man before executing anyways.


 No.519082

posting bash scripts is so 2015.

the newest craze is to post binaries and see if they work on other people's computers

this one is an eightball program written in C


^?ELF^B^A^A^@^@^@^@^@^@^@^@^@^B^@>^@^A^@^@^@^@^F@^@^@^@^@^@@^@^@^@^@^@^@^@(^[^@^@^@^@^@^@^@^@^@^@@^@8^@^H^@@^@^^^@^[^@^F^@^@^@^E^@^@^@@^@^@^@^@^@^@^@@^@@^@^@^@^@^@@^@@^@^@^@^@^@À^A^@^@^@^@^@^@À^A^@^@^@^@^@^@^H^@^@^@^@^@^@^@^C^@^@^@^D^@^@^@
^@^@^@^@^@^@^@^@^@^@^@H<83>ì^HH<8b>^EE
^@H<85>Àt^Eè<83>^@^@^@H<83>Ä^HÃ^@^@^@^@^@^@^@^@^@^@^@^@^@^@ÿ52
^@ÿ%4
^@^O^_@^@ÿ%2
^@h^@^@^@^@éàÿÿÿÿ%*
^@h^A^@^@^@éÐÿÿÿÿ%"
^@h^B^@^@^@éÀÿÿÿÿ%^Z
^@h^C^@^@^@é°ÿÿÿÿ%^R
^@h^D^@^@^@é ÿÿÿÿ%

^@h^E^@^@^@é<90>ÿÿÿÿ%^B
^@h^F^@^@^@é<80>ÿÿÿÿ%ú ^@h^G^@^@^@épÿÿÿÿ%ò ^@h^H^@^@^@é`ÿÿÿÿ%ê ^@h ^@^@^@éPÿÿÿ1íI<89>Ñ^H<89>âH<83>äðPTIÇÀP
@^@HÇÁà @^@HÇÇö^F@^@ègÿÿÿôf^O^_D^@^@¸ÿ^O`^@UH-ø^O`^@H<83>ø^NH<89>åv^[¸^@^@^@^@H<85>Àt^Q]¿ø^O`^@ÿàf^O^_<84>^@^@^@^@^@]Ã^O^_@^@f.^O^_<84>^@^@^@^@^@¾ø^O`^@UH<81>îø^O`^@HÁþ^CH<89>åH<89>ðHÁè?H^AÆHÑþt^U¸^@^@^@^@H<85>Àt^K]¿ø^O`^@ÿà^O^_^@]Ãf^O^_D^
@^@H<89>Ǹ^@^@^@^@è<8c>þÿÿ<83>Eü^A<8b>Eü;Eì|ÃH<8b>EðH<89>Çè^G^@^@^@¸^@^@^@^@ÉÃUH<89>åH<83>ìpH<89>}<98>¿n
@^@èçýÿÿH<8b>E<98>H<89>Æ¿<80>
@^@¸^@^@^@^@èáýÿÿH<8b>E<98>H<83>À^B^O¶^@^O¾ÐH<8b>E<98>H<83>À^E^O¶^@^O¾À^AÂH<8b>E<98>H<83>À^G^O¶^@^O¾À^AÐ<89>Eü<8b>Eü<89>ÇèÆýÿÿ¿¨
@^@è<8c>ýÿÿÇEø
^@^@^@HÇE Ç
@^@HÇE¨Ë
@^@HÇE°Î
@^@HÇE¸Ô
@^@HÇEÀè
@^@HÇEÈû
@^@HÇEÐ
^K@^@HÇEØ(^K@^@HÇEà7^K@^@HÇEèC^K@^@è°ýÿÿ<99>÷}ø<89>ÐH<98>H<8b>DÅ H<89>Çè^[ýÿÿ<90>ÉÃUH<89>åH<83>ì0<89>}Ü<89>uØÇEü^@^@^@^@ÇEø^@^@^@^@ÇEô^@^@^@^@¾^A^@^@^@¿
^@^@^@è&ýÿÿH<89>Eè<8b>EÜ<89>Eüéð^@^@^@H<8b>EèÆ^@^@<8b>MüºVUUU<89>È÷ê<89>ÈÁø^_)Â<89>Ð<89>Â^AÒ^AÂ<89>È)Ð<85>Àu#H<8b>UèH<8b>Eè¹R^K@^@¾W^K@^@H<89>Ǹ^@^@^@^@èþüÿÿ<83>Eø^A<8b>Müºgfff<89>È÷êÑú<89>ÈÁø^_)Â<89>Ð<89>ÂÁâ^B^AÂ<89>È)Ð<85>Àu#H<8b>UèH<8b>
u^V<8b>Eü<89>Æ¿c^K@^@¸^@^@^@^@è^Süÿÿë^VH<8b>EèH<89>Æ¿g^K@^@¸^@^@^@^@èûûÿÿ<83>Eü^A<8b>Eü;EØ^O<8c>^DÿÿÿH<8b>EèH<89>Çè¿ûÿÿ¿p^K@^@èÅûÿÿ<8b>uØ<8b>MÜ<8b>Uô<8b>EøA<89>ð<89>Æ¿ ^K@^@¸^@^@^@^@èµûÿÿ<90>ÉÃUH<89>å<89>}ü<89>uø<90>]Ã^O^_D^@^@AWAVA<89>ÿAU
^@^@^@You asked the magic eightball:
%s
^@^@^@^@^@
The magic eightball answers: ^@Yes^@No^@Maybe^@Better not tell you^@My sources say yes^@Definitely not^@Better just kill yourself fag^@dude weed lmao^@smh tbh fam^@install gentoo^@Fizz^@%s%s^@Buzz^@
^@%d
^@%s^@^@^@^@^@^@^@


----------------------------------------

^@^@^@There were %d Fizz's and %d Buzz's in the range of numbers %d to %d.


 No.519099

>>519082

wait, i'm retarded


 No.519101

>>519099

kek, nice try though.


 No.519106

File: 1455208493440.jpg (48.94 KB, 415x484, 415:484, Chen Honking At You.jpg)

>>519082

Are you okay?


 No.519135

>>510268

I thought /tech/ was free of disgusting weeaboos, but I guess I was wrong

Here's my shitty script to install stuff from aur:


#!/bin/sh
BD=~/.builds
DD=/tmp

if [ "$#" -eq 0 ]; then
echo "No arguments supplied"
exit
fi

AURL=$@
PKGP=$DD/$(echo $AURL | sed 's/.*\///')
curl $AURL -o $PKGP
DIRS=$(tar -tzf $PKGP | sed -e 's@/.*@@' | uniq)

tar -xzf $PKGP -C $BD
cd $BD/$DIRS
makepkg -si


 No.519147

>>519135

It's good practice to pipe things through sort before uniq.


 No.519438

Something to easily use VS on images. Any idea of how I could improve it? I feel there's a lot of redundancy.


#!/bin/bash


nnedi3="
import edi_rpow2 as edi
src = core.fmtc.bitdepth(src, bits=16)
src = edi.nnedi3_rpow2(src, @)"

bm3d="
src = core.fmtc.matrix(src, mat=\"601\", col_fam=vs.YUV)
src = core.fmtc.bitdepth(src, bits=16)
ref = core.bm3d.Basic(src, sigma=[@,@,@])
src = core.bm3d.Final(src, ref, sigma=[@,@,@])"

dehalo="
import havsfunc as haf
src = core.fmtc.bitdepth(src, bits=16)
src = haf.DeHalo_alpha(src)"

deband="
src = core.fmtc.bitdepth(src, bits=16)
src = core.f3kdb.Deband(src, range=@)"

awarpsharp2="
src = core.fmtc.matrix(src, mat=\"601\", col_fam=vs.YUV)
src = core.fmtc.bitdepth(src, bits=8)
src = core.warp.AWarpSharp2(src, depth=@)"

param()
{
echo "$1" | sed 's/.*(\(.*\)).*/\1/'
}

[ "$#" != 3 ] && cat << EOF && exit 1
Usage: $0 "filterchain" input output(.png)
Example: $0 "nnedi3(2) bm3d(3) deband" in.png out.png
EOF

for i in $1
do
case "$i" in
nnedi3*)
script+="$(sed "s/@/$(param "$i")/g" <<< "$nnedi3")"
;;
awarpsharp2*)
script+="$(sed "s/@/$(param "$i")/g" <<< "$awarpsharp2")"
;;
bm3d*)
script+="$(sed "s/@/$(param "$i")/g" <<< "$bm3d")"
;;
deband*)
script+="$(sed "s/@/$(param "$i")/g" <<< "$deband")"
;;
dehalo)
script+="$dehalo"
;;
*)
echo "Unknown filter: $(cut -d'(' -f1 <<< $i)"
exit 1
;;
esac
done

vspipe <(
cat << EOF
import vapoursynth as vs
core = vs.get_core()

src = core.imwrif.Read("$(pwd)/$2")
$script
src = core.fmtc.matrix(src, mat="601", col_fam=vs.RGB)
src = core.fmtc.bitdepth(src, bits=8)
src = core.imwrif.Write(src, "PNG", "/tmp/out%d.png")

src.set_output()
EOF
) /dev/null

mv /tmp/out0.png "$(pwd)/$3"


 No.519648

>>509246

I've actually run


chmod -R 555 /

I meant to run


chmod -R 555 .

Had a fun chat with tech support trying to figure out if my system could be saved. It couldn't; we just reimaged it.


 No.522247

I'm sure the problem here is obvious, but I can't find it. (MUSIC_PLAYER isn't getting set, and I don't know why).


# Music
mplay() {
if [ -z "$MUSIC_PLAYER" ]; then
find -L $(xdg-user-dir MUSIC) -print0 | shuf -z | \
exec xargs -0 mpv &>/dev/null &
MUSIC_PLAYER=$(pgrep -P $!)
disown
else
kill -SIGCONT $MUSIC_PLAYER
fi
echo $MUSIC_PLAYER = $(pgrep -P $!) \; $!
}
mstop() {
kill $MUSIC_PLAYER
unset MUSIC_PLAYER
}
alias mpause='kill -SIGSTOP $MUSIC_PLAYER'


 No.522754

I use this script

>https://clbin.com/iIaV2

for opening new terminals.

Yes, I'm serious.


 No.523250

>>509232

This is a bash script for downloading files from a given 8chan thread. I originally made it to make downloading doujins/comics easier, but added more stuff over time.

https://ghostbin.com/paste/pm49h

It supports downloading from 8chan's onion site (requires torsocks), downloading files numerically (1.jpg, 2.jpg, and so on), keeping the files' original upload names, and downloading files within ranges using a beginning and/or ending filename from the thread. You can call it with arguments, or just run it and fill in the prompts.

For the range, a beginning file only means it starts there and downloads the rest of the thread. For an ending file only, it starts at the beginning and ends after that file. You must use the filename(s) used in the post, not the post number it belongs to. It's better that way because you may not want all files from a post (and mostly just because it was easier to use a filename reference than a post reference). To make it easier, the script looks for images containing the string you enter instead of explicitly matching it. I could add searching by post numbers if people like my script, though.

I'm not experienced with bash (or programming) so the code is a mess, but it *usually* does what it's supposed to. Unfortunately, I'm still finding new bugs.

I uploaded this a few hours ago, but took it down shortly after I ran into a problem with some threads using "media.8ch.net" for file URLs and others using just "8ch.net". My script failed for threads using the latter because I never tested for it or even knew about this. I also had to find a workaround for downloading files using their original upload names if they contained spaces because of how my script calls wget. I did fix them, but now I'm wondering what else will come up.


 No.523265

Scripts at work so gonna write this from my phone.

subtract.sh

cat $1 $2 $2 | sort | uniq -u

Basically just "subtracts" any lines from file1 that also appear in file2, but doesn't add unique lines from file2. Stupid and simple, but good for comparing data dumps.
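In case the doubled $2 looks like a typo: uniq -u drops anything that appears more than once, so feeding file2 in twice guarantees none of its lines survive. Throwaway demo with made-up files:

```shell
#!/bin/sh
# file1 has a b c, file2 has b c d
printf 'a\nb\nc\n' > /tmp/f1
printf 'b\nc\nd\n' > /tmp/f2
# every line of f2 appears at least twice after sorting, so uniq -u
# kills it; lines only in f1 appear exactly once and survive
cat /tmp/f1 /tmp/f2 /tmp/f2 | sort | uniq -u   # prints: a
```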


 No.523889

This is a command to determine how bloated a distro is.

 find / *


 No.523983

>>512364

why proliferating bash makes you pure cancer: the post


 No.524010

>>509232

not exactly bash, but this picks a random line from a stream


#!/usr/bin/perl -w
rand($.) < 1 && ($pick = $_) while <>;
print $pick;
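For comparison, the same idea in awk (this is the standard reservoir-sampling idiom, not taken from the post above): each line replaces the pick with probability 1/NR, so every line ends up equally likely.

```shell
#!/bin/sh
# reservoir sampling: keep the current line with probability 1/NR;
# after the whole stream, each of the N lines had a 1/N chance of surviving
seq 10 | awk 'BEGIN { srand() } rand() * NR < 1 { pick = $0 } END { print pick }'
```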


 No.524124

>>524010

shuf -n 1


 No.526651

Is it possible to submit posts to 8ch from a bash script?

It would make posting less of a headache with text-based browsers: have a tmux pane open alongside elinks where I could just type in thread numbers, post-number replies, etc.

I try to rely on gui's as little as possible.

URLs can be sent to external programs with elinks, which is handy. I could probably create an array from URLs piped to a file in /tmp and watch for replies and auto-fill quote links/thread info.


 No.526680

>>526651

>Is it possible to submit posts to 8ch from a bash script?

Yes. Don't you remember why the global captcha was implemented?


 No.526713

>>526680

I thought that was because of that free vpn service being used as a botnet, and it had so many IPs that they could never all be banned.

I have to use Links compiled with graphical support so I can submit the solved captcha.

Also, is it just me, or does it seem like the global captcha expires every 12 hours, not every 24 hours?


 No.526759

File: 1455945301685.jpeg (6.51 KB, 300x199, 300:199, hotdogcat31.jpeg)

>>510889

It's a useful lesson, because by taking the path of least resistance, we open ourselves and the ones we love up to more and more destruction.

On one side, you have the example here of blindly trusting a random tool from a forum.

On the other side, you have the more popular example of blindly trusting a corporation's tool.

The first example is objectively less destructive because even if you didn't care enough about yourself to examine the tool at first, you have the opportunity to do so later.

With the second option, you have no such option, and so the theoretical potential for exploitation is infinite.

I would agree that to act this way is to shirk personal responsibility and perform a disservice to all involved. It's simply a result of confusion.


 No.527087

>>519147

>It's good practice to pipe things through sort before uniq.

No, it's outright necessary unless every duplicate line is going to appear next to each other.

And in this case every line is going to be duplicate (and it won't work if that's not the case) so uniq is the wrong tool for the job.
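To see why: uniq only collapses *adjacent* duplicates, so without sort, split-up duplicates pass straight through. A two-line demo:

```shell
#!/bin/sh
# the two 'a's aren't adjacent, so plain uniq keeps both
printf 'a\nb\na\n' | uniq          # prints: a b a
# sorting first makes duplicates adjacent, so uniq can collapse them
printf 'a\nb\na\n' | sort | uniq   # prints: a b
```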

>>519135

* Poorly chosen error codes

* Errors on stdout

* Uppercase variables

* Misunderstanding of $@

* Broken use of echo

* Non-quoted variables

* Uniq is the wrong way to go about getting what you want there

* Unnecessary use of sed BOTH TIMES

* Fails if the package was built before

* Doesn't bail out on "fatal" errors (cd failure)

* Doesn't ensure ~/.builds exists

* Doesn't use the AUR's git repository.

* Doesn't fix short urls (you still have to copy the full url)

* Use "--" where appropriate to ensure things don't end up getting passed as switches

* Very poor /tmp cleanup behaviour and a potential symlink race attack

Let's fix the script.

* Do not exit with success if the command fails

@@ -4,7 +4,7 @@

if [ "$#" -eq 0 ]; then
echo "No arguments supplied"
- exit
+ exit 1
fi

AURL=$@
* Send errors to stderr
@@ -3,7 +3,7 @@
DD=/tmp

if [ "$#" -eq 0 ]; then
- echo "No arguments supplied"
+ echo >&2 "No arguments supplied"
exit 1
fi
* Lowercase variable names (by convention, uppercase are for environment variables, so using uppercase could have unintended side effects)
@@ -1,17 +1,17 @@
#!/bin/sh
-BD=~/.builds
-DD=/tmp
+bd=~/.builds
+dd=/tmp

if [ "$#" -eq 0 ]; then
echo >&2 "No arguments supplied"
exit 1
fi

-AURL=$@
-PKGP=$DD/$(echo $AURL | sed 's/.*\///')
-curl $AURL -o $PKGP
-DIRS=$(tar -tzf $PKGP | sed -e 's@/.*@@' | uniq)
+aurl=$@
+pkgp=$dd/$(echo $aurl | sed 's/.*\///')
+curl $aurl -o $pkgp
+dirs=$(tar -tzf $pkgp | sed -e 's@/.*@@' | uniq)

-tar -xzf $PKGP -C $BD
-cd $BD/$DIRS
+tar -xzf $pkgp -C $bd
+cd $bd/$dirs
makepkg -si
* Fix usage of $@, replace with a more sensible loop
@@ -7,11 +7,12 @@
exit 1
fi

-aurl=$@
-pkgp=$dd/$(echo $aurl | sed 's/.*\///')
-curl $aurl -o $pkgp
-dirs=$(tar -tzf $pkgp | sed -e 's@/.*@@' | uniq)
+for aurl do
+ pkgp=$dd/$(echo $aurl | sed 's/.*\///')
+ curl $aurl -o $pkgp
+ dirs=$(tar -tzf $pkgp | sed -e 's@/.*@@' | uniq)

-tar -xzf $pkgp -C $bd
-cd $bd/$dirs
-makepkg -si
+ tar -xzf $pkgp -C $bd
+ cd $bd/$dirs
+ makepkg -si
+done
* Do not use echo on variables (because it has surprising behaviours)
@@ -8,7 +8,7 @@
fi

for aurl do
- pkgp=$dd/$(echo $aurl | sed 's/.*\///')
+ pkgp=$dd/$(printf %s\\n "$aurl" | sed 's/.*\///')
curl $aurl -o $pkgp
dirs=$(tar -tzf $pkgp | sed -e 's@/.*@@' | uniq)

* Quote all variables unless you intend to split them and glob them
@@ -9,10 +9,10 @@

for aurl do
pkgp=$dd/$(printf %s\\n "$aurl" | sed 's/.*\///')
- curl $aurl -o $pkgp
- dirs=$(tar -tzf $pkgp | sed -e 's@/.*@@' | uniq)
+ curl "$aurl" -o "$pkgp"
+ dirs=$(tar -tzf "$pkgp" | sed -e 's@/.*@@' | uniq)

- tar -xzf $pkgp -C $bd
- cd $bd/$dirs
+ tar -xzf "$pkgp" -C "$bd"
+ cd "$bd/$dirs"
makepkg -si
done
* Don't use uniq just to get the first value somewhere
@@ -10,7 +10,7 @@
for aurl do
pkgp=$dd/$(printf %s\\n "$aurl" | sed 's/.*\///')
curl "$aurl" -o "$pkgp"
- dirs=$(tar -tzf "$pkgp" | sed -e 's@/.*@@' | uniq)
+ dirs=$(tar -tzf "$pkgp" | sed -e 's@/.*@@' | head -1)

tar -xzf "$pkgp" -C "$bd"
cd "$bd/$dirs"
* Don't use sed for basic edits to single strings
@@ -8,9 +8,9 @@
fi

for aurl do
- pkgp=$dd/$(printf %s\\n "$aurl" | sed 's/.*\///')
+ pkgp=$dd/${aurl##*/}
curl "$aurl" -o "$pkgp"
- dirs=$(tar -tzf "$pkgp" | sed -e 's@/.*@@' | head -1)
+ dirs=$(tar -tzf "$pkgp" | { IFS=/ read -r path etc; printf %s\\n "$path"; })

tar -xzf "$pkgp" -C "$bd"
cd "$bd/$dirs"
* Overwrite old package, do a clean build
@@ -14,5 +14,5 @@

tar -xzf "$pkgp" -C "$bd"
cd "$bd/$dirs"
- makepkg -si
+ makepkg -sifCc
done


 No.527088

* Bail out on fatal errors

@@ -9,10 +9,10 @@

for aurl do
pkgp=$dd/${aurl##*/}
- curl "$aurl" -o "$pkgp"
+ curl "$aurl" -o "$pkgp" || exit
dirs=$(tar -tzf "$pkgp" | { IFS=/ read -r path etc; printf %s\\n "$path"; })

- tar -xzf "$pkgp" -C "$bd"
- cd "$bd/$dirs"
- makepkg -sifCc
+ tar -xzf "$pkgp" -C "$bd" || exit
+ cd "$bd/$dirs" || exit
+ makepkg -sifCc || exit
done
* Make sure ~/.builds is there
@@ -7,6 +7,8 @@
exit 1
fi

+mkdir -p "$bd" || exit
+
for aurl do
pkgp=$dd/${aurl##*/}
curl "$aurl" -o "$pkgp" || exit
* Use the git repo instead of downloading the tar (this also fixes the /tmp problem)
@@ -1,6 +1,5 @@
#!/bin/sh
bd=~/.builds
-dd=/tmp

if [ "$#" -eq 0 ]; then
echo >&2 "No arguments supplied"
@@ -10,11 +9,16 @@
mkdir -p "$bd" || exit

for aurl do
- pkgp=$dd/${aurl##*/}
- curl "$aurl" -o "$pkgp" || exit
- dirs=$(tar -tzf "$pkgp" | { IFS=/ read -r path etc; printf %s\\n "$path"; })
+ cd "$bd" || exit
+ pkgname=${aurl##*/}; pkgname=${pkgname%.git}
+
+ if [ -d "$pkgname" ]; then
+ cd "$pkgname" || exit
+ git reset --hard && git pull --ff-only || exit
+ else
+ git clone "$aurl" || exit
+ cd "$pkgname" || exit
+ fi

- tar -xzf "$pkgp" -C "$bd" || exit
- cd "$bd/$dirs" || exit
makepkg -sifCc || exit
done
* Let the user type the package name, rather than the full URL
@@ -8,9 +8,9 @@

mkdir -p "$bd" || exit

-for aurl do
+for pkgname do
cd "$bd" || exit
- pkgname=${aurl##*/}; pkgname=${pkgname%.git}
+ aurl=https://aur.archlinux.org/$pkgname.git

if [ -d "$pkgname" ]; then
cd "$pkgname" || exit
* Use "--" where appropriate
@@ -6,18 +6,18 @@
exit 1
fi

-mkdir -p "$bd" || exit
+mkdir -p -- "$bd" || exit

for pkgname do
- cd "$bd" || exit
+ cd -- "$bd" || exit
aurl=https://aur.archlinux.org/$pkgname.git

if [ -d "$pkgname" ]; then
- cd "$pkgname" || exit
+ cd -- "$pkgname" || exit
git reset --hard && git pull --ff-only || exit
else
- git clone "$aurl" || exit
- cd "$pkgname" || exit
+ git clone -- "$aurl" || exit
+ cd -- "$pkgname" || exit
fi

makepkg -sifCc || exit

Final script:

#!/bin/sh
bd=~/.builds

if [ "$#" -eq 0 ]; then
echo >&2 "No arguments supplied"
exit 1
fi

mkdir -p -- "$bd" || exit

for pkgname do
cd -- "$bd" || exit
aurl=https://aur.archlinux.org/$pkgname.git

if [ -d "$pkgname" ]; then
cd -- "$pkgname" || exit
git reset --hard && git pull --ff-only || exit
else
git clone -- "$aurl" || exit
cd -- "$pkgname" || exit
fi

makepkg -sifCc || exit
done
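As an aside: the ${aurl##*/} and ${pkgname%.git} expansions used in the intermediate versions are plain POSIX parameter expansion, no sed needed. A quick demo with a made-up URL:

```shell
#!/bin/sh
# made-up example URL, just to show the expansions
url=https://aur.archlinux.org/mypkg.git
# ${url##*/} deletes the longest prefix matching '*/': a pure-shell basename
pkgname=${url##*/}          # mypkg.git
# ${pkgname%.git} deletes the shortest suffix matching '.git'
pkgname=${pkgname%.git}     # mypkg
printf '%s\n' "$pkgname"
```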


 No.527102

>>519438

Is there a way to pass arguments to vspipe? I feel like you could do things much more easily if you moved all the logic into the python script down there.

>>522247

That depends on your shell not creating subshells on the right of pipes (which is contrary to the behaviour of most shells, including bash).

TIMTOWTDI but what I'd do is I'd invoke mpv the option 「--input-unix-socket="$HOME/.mpv-control"」. Then you can control it with socat:

echo cycle pause | socat - unix-connect:"$HOME/.mpv-control"
sleep 1
echo cycle pause | socat - unix-connect:"$HOME/.mpv-control"
sleep 1
echo stop | socat - unix-connect:"$HOME/.mpv-control"

Another way would be to get the file list into an array ($@ if you're using sh), and pass on to mpv without using a pipeline. But, getting that array in the first place is difficult.
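For the array-in-$@ approach, the usual idiom is rebuilding the positional parameters with set --. A sketch with dummy filenames (real use would end with something like exec mpv -- "$@" instead of printf):

```shell
#!/bin/sh
# rebuild the positional parameters ("the array") one element at a time;
# these filenames are dummies, one with a space on purpose
set --
for f in a.ogg 'b c.ogg' d.ogg; do
    set -- "$@" "$f"
done
# each file is still exactly one word -- spaces intact
printf '<%s>\n' "$@"
```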

>>523265

You need the 'comm' command in your life.

>>523889

df -h /

>>523983

It's just a difficult language to learn. Compared with newer languages, it's got an insane learning curve and a lot of pitfalls. It's still possible to write good code in it, and often that code runs faster and more efficiently than code in higher level languages (due to extremely well implemented tools like sort, grep, etc, as well as the automatic concurrency gained through pipes).


 No.527105

>>522247

>>527102

One more thing, `man mpv`, look for "List of Input Commands". That will give you a list of the commands you can pass (through socat) to mpv.


 No.527507

Some imagefap scraper I made just now (because my galleries-to-download backlog is getting huge):


#!/bin/sh

USER_AGENT="User-Agent: Mozilla/5.0 (Windows NT 6.1; rv:38.0) Gecko/20100101 Firefox/38.0"
GALLERY_URL="$(cut -d'?' -f1 <<< "$1")"
PAGE_URL="$GALLERY_URL"
OUTDIR="$(basename "$GALLERY_URL")"

mkdir "$OUTDIR"
cd "$OUTDIR"

while [ ! -z "$PAGE_URL" ]
do
curl -s -- "$PAGE_URL" | grep '/photo' | sed 's|.*href="\(.*\)">.*|http://www.imagefap.com\1|' | while read i
do
URL="$(curl -s -- "$i" | grep -F '"contentUrl":' | sed 's|.*"contentUrl": "\(.*\)",.*|\1|')"
curl -O --header "$USER_AGENT" -- "$URL"
done
PAGE_URL="$(curl -s -- "$PAGE_URL" | grep -F ':: next ::' | head -1 | sed 's|.*href="\(.*\)">.*|'"$GALLERY_URL"'\1|' | sed 's/amp;//g')"
done


 No.527512

>>527507

I need to learn how to write scrapers ASAP


 No.527513

>>527512

There's nothing really to learn, just inspect the html and use your brain.
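Pretty much. The whole job is "find the thing you want, split it up so you only get that thing". Toy demo on a fake page written to disk (none of this is imagefap's real markup), so it runs offline:

```shell
#!/bin/sh
# pretend this heredoc is a downloaded gallery page
cat > /tmp/page.html << 'EOF'
<a href="/photo/1">one</a>
<a href="/photo/2">two</a>
EOF
# step 1: find the thing you want (the href attributes)
# step 2: split it up so you only get that thing (strip the wrapper)
grep -o 'href="[^"]*"' /tmp/page.html | sed 's/^href="//; s/"$//'
```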


 No.528282

Gets the weather for you:

#!/bin/bash
curl -4 http://wttr.in/@$(curl -s 'https://api.ipify.org?format=plaintext')


 No.528287

>>528282

You can just remove the "?format=plaintext", it's the default anyways.


 No.528399

>>526651

I found a handy utility for getting forms from web pages.

https://journalxtra.com/linux/bash-linuxsanity/bash-filling-web-forms-with-curl-and-wget/

and some information about how to use curl or wget to submit form data.

Gonna set up elinks so URIs are processed by a script that separates youtube links from 8ch threads, since I open embeds with youtube-dl.

The post-helper script will use dialog for the UI, and will watch the tmp file for 8ch URLs (namely post links), and auto-fill its form data with the corresponding board and thread number. The input for the text field will have to be run through sed to replace characters like space/newline/& with the corresponding percent-encoded UTF-8 values for curl to send it correctly.

Kind of excited about this; it's been too much of a hassle to post via elinks. I'll post what I come up with once I get it working.

Only thing I'd be missing is the thread updater because no javascript.


 No.528403

>>519082

As someone who does not know C, I am very confused.


 No.528404

>>528403

That's not C. It's a native Linux executable but he's run it through some filter like cat -v or something.


 No.528510

>>527102

>You need the 'comm' command in your life.

That looks damn useful, but the unix distro at work doesn't even support extended regular expressions. I usually have to find extremely low-tech ways to do stuff. It might have it though. I'll check, but I'm still learning unix and linux tbh.

Thanks for the heads up.


 No.528524

>>510884

If you take bleach and ammonia into a dark closet and and stir them together for a while, you'll get cool faintly glowing crystals that form.


 No.528542

>>527088

Holy shit! Thanks for the tips. This was meant for my personal use only, so I made a lot of assumptions and basically compiled the original script by searching and copy-pasting from the internet.


 No.528565

>>528399

Successfully managed to upload a webm and text using a proof of concept script.

When I was trying to condense the post form into one chunk for curl, it kept fucking up for reasons I don't totally understand.

Finally, after breaking up the input to curl, the data posted, which means I won't have to convert spaces into %20; I'm not sure about line breaks, though.

If the data goes through with no error, there's no response received from the server; on error it creates the web page response. So when the server fails to upload a webm like it usually does, I could have it retry the post if the response file is not null, and append to the text body how many times the post failed to go through.

Maybe by tomorrow I'll have the dialog UI done.


 No.528583

>>528510

comm is part of POSIX and not complicated, so there's a good chance it's available.
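And for the "subtract file2 from file1" case it's a one-liner, as long as both inputs are already sorted (they are in this toy example):

```shell
#!/bin/sh
printf 'a\nb\nc\n' > /tmp/f1    # already sorted
printf 'b\nc\nd\n' > /tmp/f2    # already sorted
# -2 suppresses lines only in f2, -3 suppresses lines in both:
# what's left is lines unique to f1
comm -23 /tmp/f1 /tmp/f2        # prints: a
```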


 No.528795

#!/bin/bash
password="LICK MY RECTUM"
userAgent="Mozilla/5.0 (Windows NT 6.1; WOW64; rv:38.0) Gecko/20100101 Firefox/38.0"
[[ "$1" =~ ^https://8ch\.net/.*/res/.* ]] && url="$1" || { echo "Not a valid url" >&2; exit 1; }
board=$(cut -d/ -f4 <<< "$url")
threadNum=$(cut -d/ -f6 <<< "$url" | cut -d. -f1)
echo "thread: $url"
read -p email: email
read -e -d "(TAB GOES HERE SO YOU CAN HIT ENTER AND NOT STOP READ FROM READING INPUT, OR MAYBE SOME OTHER HARDLY USED CHARACTER BUT TABS ARE GOOD BECAUSE IF YOU PRESS TAB IN THE COMMENT FIELD ON THE SITE IT CHANGES FOCUS OFF THE COMMENT FIELD)" -p comment: comment
comment="$(sed 's/\r/\n/g' <<< "$comment")"
while true; do
read -p file: file
file=${file#file://}
[ ! -e "$file" ] && echo "Not a file, pick a new one" && file= && continue
fileName=$(basename "$file")
break
done
curl 'https://8ch.net/post.php' \
-H 'Host: 8ch.net' \
-H "User-Agent: $userAgent" \
-H 'Accept: */*' \
-H 'Accept-Language: en-US,en;q=0.5' \
--compressed \
-H 'DNT: 1' \
-H 'X-Requested-With: XMLHttpRequest' \
-H "Referer: $url" \
-H 'Connection: keep-alive' \
-H 'Pragma: no-cache' \
-H 'Cache-Control: no-cache' \
-F thread="$threadNum" \
-F board="$board" \
-F email="$email" \
-F body="$comment" \
-F no_country="on" \
-F "file=@\"$file\"; filename=$fileName" \
-F password="$password" \
-F json_response="0" \
-F post="New Reply"

worked for me to post in a thread

very simple rudimentary though

I imagine one could get one working with board-specific options if they have a directory where the json board settings (https://8ch.net/settings.php?board=tech) are stored, and read from there when asking for user input

but it doesn't say in that settings page how many image files are allowed per post..


 No.528923

Do curl and wget fulfil the same niche? In what situations would you use one over the other?


 No.528941

>>528923

Wget is useful for recursively downloading entire directories or sites. Curl is useful for sending arbitrary requests (e.g. with post data).


 No.528946

>>528923

Wget is useful for recursively downloading entire directories or sites. Curl is useful for sending arbitrary requests (e.g. with post data).


 No.528947

>>528923

curl is for transmitting data over a connection, wget is for downloading things. They overlap, but curl supports more obscure TLS variants, and wget understands how pages work so it can, for example, fetch page requisites or entire websites.

This is a "script" that can download all files posted in an 8chan thread, with a URL to the thread as the first argument:

#!/bin/sh
wget "$1" -e robots=off -r -nd -I '*/src' -R '*.html' -H

You wouldn't be able to do that with curl without involving half a dozen other utilities.


 No.528948

>>528946

Woops, had a cloudflare error, didn't know it post'd


 No.528961

>>528795

That is similar to what I've got written up so far, only I've left out most of the headers you specify; I've only got what's actually being posted: board, thread, body, file, post.


 No.528974

>>528947

that's useful, thank you.


 No.528984

For when people post youtube embeds in webm threads. I have this mapped to ctrl+alt+w. It takes a youtube video URL from your clipboard and plays it in vlc so you don't have to deal with javascript media players.

[code]

#!/bin/bash

#I have no clue what I'm doing, but it works for me

media=$(xclip -o -selection clipboard)

nohup vlc $media >/dev/null & disown

exit

[code]


 No.528986

>>528984

I also don't know how to put code onto this imageboard


 No.528988

>>528984

You could use this instead:

#!/bin/sh
exec vlc "$(xclip -o -sel c)"

>>528986

You use [/code ] (without the space) as a closing tag.


 No.528997

File: 1456180270556.webm (2.58 MB, 202x360, 101:180, Foodstamps.webm)

>>528988

It's a bit annoying that the F.A.Q. just says you can use the [code] tag.


 No.529144

>>528997

I don't know what system you're thinking of that has word-based tags and no differentiating close-tag

ofc, you're right in the sense that it's entirely unnecessary to operate this way, since the only reason for it is recursive nesting, which none of the available formatters actually support anyways.


 No.529145

>>527507

it's literally just a matter of finding thing you want

and then splitting it up so you only get that thing


 No.529158

>a while back i found a handy af script to output mic-in to headphone-out

>could run botnet skype or a phone call on my phone or listen to music on my mp3 player and hear it on the same headphones as my computer

>last time i used it was 2 computers ago

>);

anyone have anything like this still? i use alsa btw, tho i probably used ass tier pulseaudio at the time


 No.529159

I just saw a really strange way of implementing an if test in my Chromium OS .bash_profile

Instead of using

if [[ -f ~/.bashrc ]]
. ~/.bashrc
fi

it used

[[ -f ~/.bashrc ]] && . ~/.bashrc

Obviously, they both work and the second one is shorter, but what are some of the advantages and disadvantages of the two approaches?


 No.529162

>>529159

*

if [[ -f ~/.bashrc ]] then

I'm a fucking moron


 No.529163

>>529162

if [[ -f ~/.bashrc ]]; then

goddammit


 No.529167

>>529159

There's no real "advantages"/"disadvantages" to either in this case.

There are differences though:

- 'if' (without else) will set $? to 0 if the condition evaluates to false, while '&&' will result in $? carrying the exit code of the failed condition.

- '&&' can't have an else other than with that '||' hack, and even then you have to be careful to ensure your first branch always returns success.

- '&&' needs command grouping if you ever want to add another command in there without chaining '&&'s. '[ -f file ] && { echo Ayy it exists; . ~/.bashrc; }'

- '&&' is terser

In this case I would use '&&' since it's just a simple condition and there's no reason to waste 3 lines on "read bashrc if it exists".

Though, if the user is expected to have a bashrc, I'd drop the check entirely. The user is then warned of their missing bashrc if they don't have one, and they can create it as an empty file to silence the warning.
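The $? difference is easy to see in isolation; a minimal demo, nothing bash-specific:

```shell
#!/bin/sh
# '&&' lets the failed condition's exit status through
false && echo "never printed"
echo "after &&: $?"            # after &&: 1

# a plain 'if' with no else resets $? to 0 on a false condition
if false; then echo "never printed"; fi
echo "after if: $?"            # after if: 0
```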


 No.529171

>>529167

That's pretty much exactly what I was wondering, thanks.


 No.529368

>>529159

>bash test syntax

Your script is shit if it can't work with sh.


 No.529402

sudo curl https://clbin.com/aHy1F -o /usr/share/cowsay/cows/loss.cow && fortune | cowsay -f loss


 No.529510

>>529368

Look at what file it's sourcing. On that basis I'd give it a pass.


 No.529579

>>529510

Sure, if it's a bash_profile, I understand.


 No.529873

>>528984

>>528988

How would I do this if I have gpm instead of xclip?


 No.529986

>>529873

The best way I can think of is replacing the xclip command with "head -n 1", running the full command, pasting the URL, and then pressing enter.


 No.529989

>>529368

sh is a shit shell and you should feel bad for using it. Bash may be a pile of garbage code but, for fuck's sake, at least it does arrays.


 No.530012

>>529989

A shit shell but a good script executor. Because of its simplicity there's less it can screw up.


 No.530020

>>529989

>ANSI C is shit, you should code everything in C11


 No.530131

>>529989

sh is more common and usually faster.


 No.530286

File: 1456340334726.png (11.26 KB, 936x402, 156:67, pacman 2016-02-24 19:56:40.png)


 No.530423

I'm trying to set up my dialog UI more elegantly, but it continuously fucks up. Having to declare the same shit for every window function is going to be a pain in my ass. I don't know if just having one basic window in the background constantly and opening new windows on top of it will work.


#!/bin/bash

#this doesn't work
function win_title(){
dialog --clear --backtitle "never changes"
}
function main_win(){
win_title \
--title "changes per window" \
--menu "text" 15 15 15 \
options "description" \
etc "etc"
}
#this does work
function main_win(){
dialog --clear --backtitle "never changes" \
--title "changes per window" \
--menu "text" 15 15 15 \
options "description" \
etc "etc"
}


 No.530439

>>530423

Put "$@" (including quotes) at the end of your command inside win_title

$@ is the array of all arguments, and if you surround it in doublequotes, it expands to all the arguments exactly as passed.
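To make that concrete, here's a toy stand-in for win_title (dialog swapped out for printf so it's self-contained):

```shell
#!/bin/sh
show() { printf '<%s>\n' "$@"; }

# wrapper in the spirit of win_title: it supplies its own fixed part
# and forwards the rest of its arguments untouched
win_title() {
    # quoted "$@" expands to each original argument as its own word,
    # so arguments containing spaces survive the trip
    show "$@"
}

win_title 'changes per window' --menu text
# <changes per window>
# <--menu>
# <text>
```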


 No.530444

Shitty script to grab the last post ID on a board:

#!/bin/bash
curl "https://8ch.net/$1/res/$(
curl https://8ch.net/$1/threads.json 2>/dev/null | \
json_pp -json_opt pretty,canonical,utf8 | \
sed -n '/"no"/ { s/\s*"no"\s*:\s*//p; q }'
).json" 2>/dev/null | \
json_pp -json_opt pretty,canonical,utf8 | \
tac | \
sed -n '/"no"/ { s/\s*"no"\s*:\s*//p; q }'

json_pp is from some perl 5 thing but anything that pretty-prints one key-value pair per line will work as a substitute


 No.530456

>>530439

Well I'll be damned, thanks for that.

That will help reduce the bloat a little.


 No.530587

>>530444

Since you're using perl's json_pp, you might as well move the logic (from sed) into perl too:

#!/bin/sh
top_thread=$(curl -sSL https://8ch.net/$1/threads.json |
perl -MJSON::PP -e '$/="";print decode_json(<>)->[0]{threads}[0]{no}') || exit
curl -sSL "https://8ch.net/$1/res/$top_thread.json" |
perl -MJSON::PP -e'$/="";print decode_json(<>)->{posts}[-1]{no}."\n"'

I've changed some other things:

- #!/bin/sh because no bashisms are used

- Made a "top_thread" variable to reduce the clutter in the one line

- Removed the backslash after pipe (not necessary)

- Changed the "2>/dev/null" to "-sSL" (my usual curl options; silent, Show errors, foLlow redirects).

- Exit if we don't get a top thread (i.e. if json parsing fails)

I'm just about to head to bed so I haven't fixed the most glaring bug in the script: this picks locked threads and stickies which skews the result a lot. It currently thinks /tech/ is on post 132865.


 No.530589

As an aside, the way I would usually parse that json is to install the jq tool and run it:

jq -r '.[0].threads[0].no'

jq -r '.posts[-1].no'


 No.530979

# Wrap scp(1) to check for missing colons
scp() {
local argstring
argstring=$*
if (($# >= 2)) && [[ $argstring != *:* ]] ; then
printf 'bash: %s: Missing colon, probably an error\n' \
"$FUNCNAME" >&2
return 2
fi
command scp "$@"
}


 No.530981

# Count arguments
ca() {
printf '%d\n' "$#"
}


 No.531011

>>530979

Why do you assign the value of $* to $argstring instead of just using $* everywhere you now use $argstring?


 No.531054

I've been abusing shell features to make functional programming possible.

The first thing that's obviously missing is anonymous functions (lambdas).

lambda () {
lambda_expand $@
}

lambda_pipe () {
lambda_expand $(cat)
}

lambda_expand () {
echo "lambda_run $# $*"
}

lambda_run () {
local fun=
local fun_len=$1
shift
while [ "$fun_len" -gt 0 ]; do
fun="$fun $1"
shift
fun_len=$((fun_len - 1))
done
eval "lambda_temp () { $fun; }; lambda_temp $*"
}

Usage:

$ $(lambda 'echo $#') a b c
3
$ ca=$(lambda 'echo $#')
$ $ca a b c d
4
$ ca=$(lambda_pipe << EOF
echo \$#
EOF
)
$ $ca a b c d e
5

It's not strictly POSIX, because of local, but it works in ash and mksh, so it should work almost everywhere.

Inferno's sh (which is like Plan 9 rc, not like Bourne sh) supports something like this natively, with another syntax.


 No.531057

>>531054

Next, you may want a map function, to apply a function to each argument in turn.

map () {
local fun=$1
shift
for x in "$@"; do
$fun "$x"
done
}

Usage:

$ map "$(lambda 'echo $(($1*$1))')" $(seq 5)
1
4
9
16
25


 No.531143

>>530587

Nice improvement. I was in a rush when I wrote the original because I used it to steal another board's get, or else I would've put a little more effort in.


 No.531357

>>510884

cry some more you fucking faggot


 No.531359

the amount of crybaby faggots itt is amazing.

when did crying about meanies on chans become a thing?


 No.531430

>>531011

appeasing shellcheck(1), from memory


 No.531460

>>531054

This has a lot of problems with word splitting and globbing and such because you are careless with your arguments.

Just use eval.

foreach() {
local fun arg
fun="$1 \"\$arg\""
shift
for arg do eval "$fun"; done
}

foreach echo 'you think youre though uh' 'i have one word for you' 'the forced indentation of the code'
foreach 'count() { echo "${#1}";}; count' look at all these unquoted words, truly I\'m living on the edge

You pass code in as a string, and if you don't want to put a whole fucking program in there you just pass it a function name.


 No.531491

>>531460

There are issues with the code I posted (I fixed one of them by moving the calling of the lambda_temp function out of eval, to get eval "lambda_temp () { $fun; }"; lambda_temp "$@"), but your alternative no longer does what I want to do. With my version, you can pass the output of lambda around and mostly treat it as if it's just a command name, instead of a convoluted function definition.

My goal was to be able to do things like this, whether it's really useful or not:

$ count=$(lambda 'echo "${#1}"')
$ $count 'the function is now stored in the variable'
42
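Sketched out, the fixed call path described above might look like this (lambda_run is a made-up name standing in for the real lambda machinery):

```shell
# Hypothetical reconstruction of the fix: eval only defines the
# function, and the call happens outside eval, so "$@" keeps its
# quoting intact.
lambda_run () {
    fun=$1
    shift
    eval "lambda_temp () { $fun; }"
    lambda_temp "$@"
}
lambda_run 'echo "${#1}"' 'the function is now stored in the variable'
```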


 No.531505

It'd be nice if somebody who knows Perl could create a Perl scripting thread.


 No.532213

>>531505

Why not start one? If you build it, they will come.


 No.532224

>>530444

>>530587

#!/bin/sh
top_thread=$(curl -sSL https://8ch.net/$1/catalog.json |
perl -MJSON::PP -e '$/="";print ((grep { !$_->{sticky} && !$_->{locked} } @{decode_json(<>)->[0]{threads}})[0]{no})') || exit
curl -sSL "https://8ch.net/$1/res/$top_thread.json" |
perl -MJSON::PP -e'$/="";print decode_json(<>)->{posts}[-1]{no}."\n"'

A drunken attempt to fix that bug


 No.532232

>>512239

GUISE! DON'T TEST THIS ! IT DESTROYED MY MANJARO!


 No.534432

File: 1456824599849.jpg (38.15 KB, 327x481, 327:481, E78B31C.jpg)


#!/bin/bash
echo -e "########################################## \n"
echo -e "HI ${USER} \n"
echo -e "########################################## \n"

sleep 5

scr=find /home/ | grep -i "scrot.png\|screenfetch\-"
target_dir=${HOME}/Pictures/screenshots/
if [ -d $target_dir ]; then
find /home/ | grep -i "scrot.png\|screenfetch\-" | xargs -I{} mv {} $target_dir
echo -e " Folder exists and Screenshots moved to ${target_dir} \n"
else
mkdir -p $target_dir
find /home/ | grep -i "scrot.png\|screenfetch\-" | xargs -I{} mv {} $target_dir
echo -e "Folder ${target_dir} created and Screenshots moved to ${target_dir} \n"
fi


 No.534452

>>534432

Thanks it's useful just for the variable usage alone as I forgot how it worked.


 No.534453

>>532232

>Arch

>Not knowing not to put random input into their system.

Why am I not surprised?


 No.534478

>>534432

Did you read >>534205 ? Your src=... line isn't even syntactically correct.


 No.534483

File: 1456833598550.png (39.06 KB, 659x448, 659:448, gnuuu.png)


 No.534487

File: 1456834427837.png (55.31 KB, 534x344, 267:172, 1456563025135.png)

>>534478

> Your src=... line isn't even syntactically correct.

How? It's useless anyways, remove it.


if [ -f $scr ]; then

I thought of a better way and forgot to remove it

>>grep when you could use -iname

I never use -iname because grep is more powerful when dealing with wildcards

>>no #/bin/sh

what's the difference between bash and sh?


 No.534489

>>534487

You need to quote the string when it's inside single brackets, I believe

if [ -f "$src" ]; then

or

if [[ -f $src ]]; then


 No.534491

>>534489

or


if [ -f "${src}" ]; then

>[[

why double brackets?


 No.534504

>>534491

It's a bash extension, supporting more comparisons (I think). It's not necessary for common tests like '-f'.


 No.534505

>>534489

You have to quote it either way, because this is sh and not a good shell.

[ is a program. Try it:

which [
/usr/bin/[

I won't presume to know why it exists but if I were to venture a guess I'd say it's because writing if test -f "$src"; then ... sucks royal shit for readability.

[[ is bash syntax and not a program.


 No.534507

Dave Taylor writes a shell script column at Linux Journal. He's also written a book called Wicked Cool Shell Scripts, though it's a bit dated. Kyle Rankin's LJ articles are also good, and Henry Grebler who wrote some pieces for Linux Gazette.


 No.534512

>>534487

It is syntactically correct. It just doesn't do what you think it does.

It tries to run the "/home/" directory as a command with the environment variable "scr=find". Then it pipes the output of that (nonexistent) command into a grep.

The end result is nothing changes but you get some errors thrown around.
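You can see that parsing rule in isolation:

```shell
# VAR=value followed by a command word runs the command with VAR set
# only in that command's environment -- which is how the shell read
# the scr=find line.
scr=find sh -c 'echo "scr is: $scr"'   # prints: scr is: find
echo "scr here: ${scr:-unset}"         # prints: scr here: unset
```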

>if [ -f $scr ]; then

quote

your

variables

>I never use -iname because grep is more powerful when dealing with wildcards

Also much slower, and the way you're using it, much more error prone.

>what's the difference between bash and sh?

kek

bash has a ton of features added on top of sh, like the double-left-bracket keyword and arrays. If you don't use them you should probably use sh, since many distros package a much faster shell as sh.
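A quick demo of one such feature, assuming bash is installed:

```shell
# Arrays are a bashism: fine in bash, but a syntax error in a strict
# POSIX sh like dash or ash.
bash -c 'a=(one two three); echo "${a[1]}"'   # prints: two
```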

>>534491

>ugly useless curlies

why.

>>534504

It supports some more comparisons, but simply due to it being a bash extension I'd avoid it.

It also gives people the illusion that they don't need to quote their variables, but:

a=stuff
b=*
if [[ $a = $b ]]; then echo wrong; fi

As you can see, these two non-identical variables compare as identical when you don't quote with the [[ keyword. So I'd argue that the benefits you get from using [[ are confusing to the point that they're somewhat dangerous.

Just use [ and learn how to quote. For everything else there's case and maybe grep.
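For reference, case gets you the same pattern matching portably, and quoting decides whether the pattern is taken literally:

```shell
a=stuff
b='*'
case $a in
    ("$b") echo "literal match" ;;  # quoted: only the one-character string *
    ($b)   echo "pattern match" ;;  # unquoted: a glob, matches anything
esac
```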

>>534505

>You have to quote it either way, because this is sh and not a good shell.

[[ is not in sh

>and not a good shell

:^)


 No.534518

>>534512

>>ugly useless curlies

>why.

For readability when concatenating


if [ -f "${HOME}/Downolads/txt.txt." ]; then


 No.534525

>>534512

>[[ is not in sh

>>[[ is bash syntax and not a program.

Yes, genius. Bash's main selling feature is still sh compatibility. If not for that we could all be using something worthwhile such as rc.

>>534518

Explain how that's more readable than

if [ -f "$HOME/Downolads/txt.txt." ]; then
or for that matter
if (test -f $home/Downolads/txt.txt.) {
and not just needless decoration.


 No.534531

>>534525

I thought that's the only way to concatenate variable and string. Good to know


 No.534533

>>534491

Double brackets operate on variables themselves, not just on the words they expand to.

If $arg equals "a b", then the shell is not allowed to see a difference between [ $arg = "a b" ] and [ a b = "a b" ]. If $arg equals "", then the shell is not allowed to see a difference between [ $arg = "example" ] and [ = "example" ]. Quoting arguments fixes that, but it's a common mistake. It's a design flaw in Unix. [ ] looks like a language feature, but it has to behave like an external program, with all of the resulting limitations. "test" is equivalent to [ (except that it doesn't need a closing ]), and clearly a command, so arguably better.

Double brackets also support bash extensions like substring matching. I think bash [ doesn't.
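A minimal illustration of the quoting flaw described above, with an empty variable:

```shell
arg=''
# Unquoted, [ only receives: [ = example ]  -- an operator with no
# left-hand operand, so it errors out instead of comparing.
if [ $arg = "example" ] 2>/dev/null; then echo "matched"; else echo "error or no match"; fi
# Quoted, [ receives an empty word and compares it normally.
if [ "$arg" = "example" ]; then echo "matched"; else echo "clean no-match"; fi
```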

>>534525

These are bash's selling points:

- Full backward compatibility with sh, so all sh code works in bash

- Extensions that make scripting safer and less of a pain when used

- Being very widely available

I script in almost-pure sh when it's not much harder to do so, but using bash features is acceptable.

If bash's main selling point were the possibility of running bash code in sh it wouldn't be as popular.


 No.534534

>>534531

If you want to concatenate $HOME and "abc" for some reason, using ${HOME} is a good solution, but you don't need it when concatenating $HOME and "/whatever", because / can't be part of a variable name.

"$HOME"abc or "$HOME""abc" is another solution.


 No.534541

>>534533

Why do you insist on reading what I write backwards? Please stop that. It's incredibly annoying.


 No.534560

So to find a substring in a string without using [[

[[ "$string" = *"$substring"* ]] && echo yes

it's best to use case statements?

case "$string" in *"$substring"*) echo yes;;esac

I looked for other solutions and they looked convoluted, like redirecting strings into grep if statements


 No.534581

>>534525

Your second solution is wrong because:

- Variables are case sensitive (you should have $HOME)

- You need to quote the variable. Not doing so is an invitation for several bugs.

- Your parens create an unnecessary subshell, slowing down the script and making it use more resources.

Your first is perfect though.

>>534560

There's 2 ways I go about finding substrings in sh.

1. The way you demonstrated:

case $var in
(*"$substring"*) echo found it;;
esac

2. # or % parameter expansions:

if [ "${var%"$substring"*}" != "$var" ]; then
echo found it
fi

if [ "${var#*"$substring"}" != "$var" ]; then
echo found it
fi

# ... etc.
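A runnable version of method 2 with concrete values filled in:

```shell
var='foo bar baz'
substring='bar'
# If stripping through the substring changes the string, it was present.
if [ "${var#*"$substring"}" != "$var" ]; then
    echo "found it"
fi
```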


 No.534582

>>534581

You're wrong on all 3 counts but only because you wrongly assumed that was an sh script.


 No.534595

>>534560

The second thing doesn't work, except maybe in shells that support [[ anyway.

You have to treat [ as if it's an external command. External commands don't see what you actually entered, they see what the shell gave them, and if you use globbing with *, the shell expands it to look for filenames.


 No.534597

>>534541

Were you trying to argue against the use of [[? That's what I thought when I read your post, but I'm no longer sure. Sorry.


 No.534634

>>534582

epic

If you want to write a zsh script, say you're doing so, otherwise I'll just assume you're trying to write sh (given that it's the topic of conversation), and that your skill is on par with 90% of /tech/.

>>534595

>The second thing doesn't work

It's standard sh syntax so it should work in all bourne-like shells. What did you try?


 No.534644

>>534634

I could've sworn I quoted a different post, but I can't find it. I was replying to something like [ "$string = *"$substring"* ] (which is why I talked about [ afterwards). Sorry.


 No.534648

>>534644

No worries

Maybe the post got deleted out of shame :^)


 No.534789

>>534634

It's not zsh either. At least not intentionally. It's rc


 No.534794

File: 1456866002997.jpg (427.14 KB, 1543x1360, 1543:1360, 1454342586637.jpg)

>>534432

mpv screenshots



#!/bin/bash
echo -e "########################################## \n"
echo -e "HI ${USER} \n"
echo -e "########################################## \n"

sleep 5
target_dir=${HOME}/Pictures/screenshots/
if [ -d $target_dir ]; then
find /home/ | grep -i "scrot.png\|screenfetch\-\|mpv\-shot" | xargs -I{} mv {} $target_dir
echo -e " Folder exists and Screenshots moved to ${target_dir} \n"
else
mkdir -p $target_dir
find /home/ | grep -i "scrot.png\|screenfetch\-\|mpv\-shot" | xargs -I{} mv {} $target_dir
echo -e "Folder ${target_dir} created and Screenshots moved to ${target_dir} \n"
fi


 No.534813

>>534794

#!/bin/sh
target_dir=$(xdg-user-dir PICTURES || printf %s\\n ~/Pictures)/screenshots
mkdir -p -- "$target_dir" || exit
find ~ -type f \( -name '*scrot.png' -o -name 'screenfetch-*' -o -name 'mpv-shot*' \) \( \! -path "$target_dir*" \) -exec mv -vt "$target_dir" -- {} +

Why do I have deja vu all of a sudden?

As for your script:

Problems:

- Forces the user to wait

- Silly banner? I mean they know their own username at this point surely, and I'm not sure the octothorpe spam is really insightful to anyone.

- Fails on special characters in many places

Suggestions:

- Don't use echo -e, it has been deprecated since forever

- Don't sleep unless your script is specifically to do with (roughly) timing things.

- No need to duplicate that code over both the branches, keep a variable handy instead.

- Do they really need to know that the directory was created? It's going to be there at the end of the script either way.

- The banner can go too.

- find/grep/xargs usage is incorrect (splits on spaces, you should be using the null-delimiting features of gnu findutils/grep here), find/grep usage is wasteful, find usage is incorrect (picks up non-files). I would recommend rewriting it without grep/xargs but failing that, use find's print0 option, grep's -Z option, and xargs' -0 option.

- Quote your variables. $target_dir in all those instances needs quoting.

- You have spaces and capitals in strange places in your echos, it's untidy.
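A sketch of that null-delimited pipeline (GNU find and xargs assumed), pointed at a throwaway sandbox so it's safe to try:

```shell
# Create a disposable source and destination so nothing real is moved.
src=$(mktemp -d); dst=$(mktemp -d)
touch "$src/a scrot.png" "$src/weird  name scrot.png"
# -print0/-0 delimit on NUL, so spaces in filenames survive intact.
find "$src" -type f -name '*scrot.png' -print0 |
    xargs -0 -I{} mv -- {} "$dst"
ls -- "$dst" | wc -l   # 2: both files moved despite the spaces
```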


 No.534816

- Your script picks up files that are already in the right directory, and tries to move them again. Ensure this doesn't happen by checking the source path first.


 No.534970

a work in progress, it turns folders filled with image files into cbz archives

#!/bin/sh

for DIR in "$@"; do
if [ -d "$DIR" ]; then
echo "creating cbz from $DIR"
zip -rjq $PWD/"${DIR// /_}".cbz "$DIR"/*
elif
[ ! -d "$DIR" ]; then
echo "$DIR is not a directory"
fi
done

exit 0


 No.534983

>>534970

3Gwwdtd7GJkf"xF$P8GA >&2<ESC>Gdk5Gcwprintf<ESC>f$ct"%s<ESC>A ""<ESC>P8Gcwprintf<ESC>f$cE%s<ESC>f"a ""<ESC>Pkccelse<ESC>:%s/DIR/dir/g<CR>

At least, that's my opinion.


 No.534985

>>534983

Whoops

5G2f"i\n<ESC>8G;;.


 No.534998

One more thing I didn't notice. That shebang should be #!/bin/bash because you use a bashism on line 6.


 No.535055

Are Here Strings

http://linux.die.net/abs-guide/x15683.html

grep "shit" <<< "$string"

recommended over

echo "$string" | grep "shit"

?


 No.535172

>>534970


if(x != 0)
do_a();
else if(x == 0)
do_b()

You're retarded.


 No.535211

File: 1456928921841.png (24.79 KB, 482x326, 241:163, Screenshot_2016-03-02_15-2….png)

>>534983

>>534998

ok, thanks, did some changes to the script now

>>535172

yea, I noticed that too... brb, committing sudoku

#!/bin/bash
# turns a Directory with image files into a CBZ archive

if [ $# -lt 1 ]; then
echo "Usage: $0 directory <directory> <directory> ..."
exit 1
fi

for DIR do
if [ -d "$DIR" ]; then
FILENAME=${DIR##*/}
FILENAME=${FILENAME// /_}
printf 'Creating %s.cbz from %s\n' "$FILENAME" "$DIR"
zip -rjq "$PWD/$FILENAME".cbz "$DIR"/*
else
printf '%s is not a directory\n' "$DIR"
fi
done

exit 0


 No.535244

File: 1456933495170.jpg (100.3 KB, 1662x895, 1662:895, fotWDgd.jpg)

I recently wrote a bash script that takes a screenshot and then opens an interactive menu to ask the user what to do with it (pic related).

I mainly use it to (edit and) upload screenshots to imgur without saving them.

I'm pretty new to bash and I can't really program, so this script is probably terrible, but it does what it's supposed to and I'm pretty proud of it.

https://github.com/Lochverstaerker/screenr/ if you want to check it out.


 No.535247

>>511069

Please extricate yourself from the gene pool.


 No.535274

>>535244

>jpg screenshot

>needing a GUI for this


 No.535277

>>535244

I'd just do


parallel zip -rjq {/.}.cbz {} ::: "$@"


 No.535279

>>535277

Whoops, for >>535211


 No.535280

These are just a few really basic scripts that I find to be extremely useful

This is a script for feh that takes a target directory as an argument, shows all the images there as a slideshow, and moves each image you hit enter on into your current directory

feh -x -P -d --cycle-once -A 'mv %F ./%N' $1

same idea but for deleting stuff in the current directory or a specified one

if [ -z "$1" ]; then

DIR=.

else

DIR=$1

fi

feh -x -P -d --cycle-once -A 'rm %F' $DIR

this is a script that takes what's in your clipboard (I know Xorg doesn't call it that) and downloads it to your downloads folder using youtube-dl. I usually bindsym it to $mod+Shift+p. It requires a program called xclip, so make sure you have it

youtube-dl -o '~/Downloads/%(title)s.%(ext)s' `xclip -o` --restrict-filenames

the --restrict-filenames is important because youtube-dl won't save filenames that have not been escaped properly even if the program looks fine when it's running

you can also setup youtube-dl(or playing things with mpv) by using a firefox addon called Open-with ( https://addons.mozilla.org/en-US/firefox/addon/open-with/ ). you would need to make a script with this:

youtube-dl -o '~/Downloads/%(title)s.%(ext)s' "$1" --restrict-filenames

inside it and then point it to that but then you have simple right click ripping of whatever you want


 No.535283

>>535280

You should just use sxiv with a nice key-handler


 No.535294

>>535274

Because typing

maim -s /tmp/name.png && pinta /tmp/name.png && xdg-open $(imgur /tmp/name.png) && rm /tmp/name.png

and similarly long commands (that don't even do exactly what my script does) is faster than doing ~5 keyboard inputs using this "GUI"


 No.535312

>>509232

for a fresh install of Linux Mint 17.2

it is of course half finished shit

#!/bin/bash

#App instalations
#Chromium browser
sudo apt-get -y install chromium-browser
#cups-pdf virtual printer
sudo apt-get -y install cups-pdf
#devils pie (for moving teamviewer autostart to workspace 2)
#remember to comment out "self.updateautostartstatus()" in "/usr/bin/gdevilspie" python file and set the rule for TeamViewer. The daemon can be found here http://burtonini.com/blog/computers/devilspie
sudo apt-get -y install devilspie
sudo apt-get -y install gdevilspie
#dropbox cloud storage
sudo apt-get -y install dropbox
#gparted partition manager
sudo apt-get -y install gparted
#teamviewer installer and dependencies
sudo wget download.teamviewer.com/download/teamviewer_i386.deb
sudo apt-get install libc6:i386 libgcc1:i386 libasound2:i386 libfreetype6:i386 zlib1g:i386 libsm6:i386 libxdamage1:i386 libxext6:i386 libxfixes3:i386 libxrender1:i386 libxtst6:i386 libxrandr2:i386
sudo dpkg -i teamviewer_linux.deb
#wine for running windows apps
sudo apt-get -y install wine
#virtualbox for running Virtual Machines
sudo apt-get -y install virtualbox-nonfree



#OS and program updates
sudo apt-get -y update
sudo apt-get -y -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" dist-upgrade


#Program addons
#dropbox: installs the daemon, which is required to run and will download automatically at start anyway
cd ~ && wget -O - "https://www.dropbox.com/download?plat=lnx.x86_64" | tar xzf -
#virtualbox extension: installs the current version of the extension pack, probably outdated by the time you read this so make sure to update the version
sudo wget http://download.virtualbox.org/virtualbox/5.0.2/Oracle_VM_VirtualBox_Extension_Pack-5.0.2-102096.vbox-extpack;
sudo VBoxManage extpack install Oracle_VM_VirtualBox_Extension_Pack-5.0.2-102096.vbox-extpack


#firefox add-ons installation
#create the extensions directory which is usually only made when a fresh install of firefox starts for the first time
mkdir ~/.mozilla/firefox/mwad0hks.default/extensions
#Classic Theme restore
wget https://addons.mozilla.org/firefox/downloads/latest/472577/addon-472577-latest.xpi
mv addon-472577-latest.xpi ~/.mozilla/firefox/mwad0hks.default/extensions/ClassicThemeRestorer@ArisT2Noia4dev.xpi
#DownThemAll!
wget https://addons.mozilla.org/firefox/downloads/latest/201/addon-201-latest.xpi
mv addon-201-latest.xpi ~/.mozilla/firefox/mwad0hks.default/extensions/{DDC359D1-844A-42a7-9AA1-88A850A938A8}.xpi
#ExHentai Easy 2
wget https://addons.mozilla.org/firefox/downloads/file/299538/exhentai_easy_2-0.2.7-fx+an.xpi
mv exhentai_easy_2-0.2.7-fx+an.xpi ~/.mozilla/firefox/mwad0hks.default/extensions/jid1-7NbXi2AqS1oUFw@jetpack.xpi
#Greasemonkey
wget https://addons.mozilla.org/firefox/downloads/latest/748/addon-748-latest.xpi
mv addon-748-latest.xpi ~/.mozilla/firefox/mwad0hks.default/extensions/{e4a8a97b-f2ed-450b-b12d-ee082ba24781}.xpi
#Tree Style Tab
wget https://addons.mozilla.org/firefox/downloads/file/302164/tree_style_tab-0.15.2015030601-fx.xpi
mv tree_style_tab-0.15.2015030601-fx.xpi ~/.mozilla/firefox/mwad0hks.default/extensions/treestyletab@piro.sakura.ne.jp.xpi
#uBlock Origin
wget https://addons.mozilla.org/firefox/downloads/latest/607454/addon-607454-latest.xpi
mv addon-607454-latest.xpi ~/.mozilla/firefox/mwad0hks.default/extensions/uBlock0@raymondhill.net.xpi
#Video DownloadHelper
wget https://addons.mozilla.org/firefox/downloads/latest/3006/addon-3006-latest.xpi
mv addon-3006-latest.xpi ~/.mozilla/firefox/mwad0hks.default/extensions/{b9db16a4-6edc-47ec-a1f4-b86292ed211d}.xpi
#Xmarks
wget https://addons.mozilla.org/firefox/downloads/latest/2410/addon-2410-latest.xpi
mv addon-2410-latest.xpi ~/.mozilla/firefox/mwad0hks.default/extensions/foxmarks@kei.com.xpi


 No.535325

>>535294

You don't know what an alias or a shell function is, you tard?


 No.535330

>>535325

Yes, dozens of aliases are obviously superior to ~5 keyboard inputs, my bad.


 No.535333

>>535330

You can do your shit with read and have the same functionalities, with your "5 keypresses" and with less dependencies.

Also, use IM's import, it's usually already installed on every system.


 No.535340

>>535333

using read would only get rid of 1 dependency (rofi)? I use that anyway.

And import doesn't have the functionality I need.


 No.535353

>>535312

You can install multiple packages in a single command. It's a good idea to update before installing packages. The "-y" flag doesn't do anything to apt-get update, because it doesn't need confirmation. There's no need to use sudo to download something, so it's best not to. The first part of your script could be:

sudo apt-get update
sudo apt-get -y -o Dpkg::Options::="--force-confdef" -o Dpkg::Options::="--force-confold" dist-upgrade
sudo apt-get -y install chromium-browser cups-pdf devilspie gdevilspie dropbox gparted wine virtualbox-nonfree libc6:i386 libgcc1:i386 libasound2:i386 libfreetype6:i386 zlib1g:i386 libsm6:i386 libxdamage1:i386 libxext6:i386 libxfixes3:i386 libxrender1:i386 libxtst6:i386 libxrandr2:i386
wget download.teamviewer.com/download/teamviewer_i386.deb
sudo dpkg -i teamviewer_i386.deb
rm teamviewer_i386.deb

The Firefox part of your script only works if you copy .mozilla from another system, in which case you wouldn't need to install those extensions anyway. I haven't tried, but I think this will launch Firefox and try to directly, properly install all those extensions:

firefox https://addons.mozilla.org/firefox/downloads/latest/472577/addon-472577-latest.xpi https://addons.mozilla.org/firefox/downloads/latest/201/addon-201-latest.xpi https://addons.mozilla.org/firefox/downloads/file/299538/exhentai_easy_2-0.2.7-fx+an.xpi https://addons.mozilla.org/firefox/downloads/latest/748/addon-748-latest.xpi https://addons.mozilla.org/firefox/downloads/file/302164/tree_style_tab-0.15.2015030601-fx.xpi https://addons.mozilla.org/firefox/downloads/latest/607454/addon-607454-latest.xpi https://addons.mozilla.org/firefox/downloads/latest/3006/addon-3006-latest.xpi https://addons.mozilla.org/firefox/downloads/latest/2410/addon-2410-latest.xpi

Some of the extensions are probably available as packages that you can install with apt-get. For example, Debian has a package named xul-ext-treestyletab.


 No.535397

>>510268

lrnkata: line 67: syntax error near unexpected token `set'
lrnkata: line 67: ` set random (seq (count $$learn) | shuf -n 1)'

Is that my fault?


 No.535416

>>535397

Are you running it with fish?


 No.535421

>>535416

No, I had no idea what that was.

Thanks!


 No.535453

File: 1456959616140.jpg (569.17 KB, 1280x720, 16:9, 1379456345309.jpg)

Took a break from working on my bash UI for posting with curl because I got stuck on figuring out how to trim curl's input for unused forms, I thought a 2d array should work for that, but bash doesn't support multi-dimensional arrays.

3 days later, it dawns on me that I can just use 2 arrays to supplement the lack of a 2d array.

Fuck I'm dumb.

>>534453

You forgot that Manjaro is the idiot's Arch.

Doesn't matter because anything Arch is shit anyway.

I still can't give up the AUR...


 No.535485

>>535453

What do you mean by this?

You don't want to send forms that are empty?

(e.g. if you're just doing a picture post you don't want to send -F"comment=" )?


 No.535498

>>535485

Yeah basically.

If $comment is null, don't send `-F "comment=$comment"`

I guess it's a bit unnecessary to even bother with omitting it entirely, but whatever, it hasn't caused any issue with omitting it, that I've seen.


 No.535551

>>535498

I don't know how to do it with arrays but i achieved it by using -K

...
formFile="/tmp/$(date +%s)"
echo "form = \"thread=$threadNum\"" > $formFile
echo "form = \"board=$board\"" >> $formFile
[ "$email" ] && echo "form = \"email=$email\"" >> $formFile
[ "$comment" ] && echo "form = \"body=$comment\"" >> $formFile
...
curl 'https://8ch.net/post.php' \
(Headers) \
-K $formFile

rm "$formFile"

Limitations:

-Can't start any field with a @. Couldn't work around it even with escaping with \. It reads that as a file to read from not a string.

-Have to escape things like new lines, double quotes, and backslashes


 No.535566

>>535551

Here's how I've done it, it's short, sweet, and to the point:


field_in=( "$name" "$subj" "$email" "$pass" )
fields=( name subject email password )
for i in ${!fields[*]}; do
if [ ! -z "${field_in[$i]}" ]; then
f_str="$f_str -F {fields[$i]}='${field_in[$i]}'"
fi
done


 No.535588

File: 1456978217707.png (11.05 KB, 1167x1176, 389:392, curl-data.png)

>>535566

How do you send the full form variable data through curl, though?

If I try send f_str quoted or unquoted it won't work if there are spaces in any of the fields.


 No.535600

>>535588

What do we think about Perl LWP?


 No.535609

>>509246

silly anon, you forgot to append tmp/ to that last slash!


 No.535614

>>535588

I might have to change it so the form entries are in double quotes, I've not yet done a field test for it.

I also use the -s -e switches and any feedback from the server is sent to >response.html


 No.535620

>>535609

That's-the-joke.png


 No.535664

>>535551

>formFile="/tmp/$(date +%s)"

Use mktemp. It does exactly what you're trying to do, but better.
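The usual mktemp pattern, sketched:

```shell
# mktemp creates a unique file atomically (no date-collision races),
# and the trap cleans it up whenever the script exits.
formFile=$(mktemp) || exit 1
trap 'rm -f -- "$formFile"' EXIT
printf 'form = "board=%s"\n' tech > "$formFile"
cat -- "$formFile"
```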


 No.535674

>>535353

multiple commands on a single line is something I should probably do, yeah since it takes less time

Firefox however you are not correct about since this is a tested script that I had put together after getting sick of not being able to make a custom live install cd, the only things really wrong with it are the outdated tree style tabs link and maybe 1 other thing

I will give what you suggested a shot though since that could come in real handy

will look into the packages too

seriously tho Mint 17.2 comes with firefox installed and it has been the same directory enough times I run it that I would bet money it works, I used it 2 days ago after all


 No.535677

>>535674

except for the glaring _linux.deb I forgot to fuckign fix, god damnit


 No.535678

>>535674

That's not multiple commands on a single line, and it actually makes complications less likely to happen, because it can calculate what the final package state should be right away.


 No.535691

Gets the total size of all files in a given thread.

#!/bin/bash

thread="https://8ch.net/tech/res/522593.html"
size="0"
files="0"
links=$(curl "$thread" | tr '"' '\n' | grep src | grep http | uniq)
total=$(echo $links | wc -w)
while read file; do
fsize=$(wget --spider "$file" 2>&1 | grep Length | cut -d' ' -f2)
if [[ -n $fsize ]]; then
(( size += $fsize ))
(( files++ ))
fi
perc=$(echo "scale=1;(($files*100)/$total)" | bc)
sizeMB=$(echo "scale=2;($size/1048576)" | bc)
printf "\r\e[KIndexed files: $files/$total ($perc%%) // Current size: $size ($sizeMB MiB)"
done < <(echo "$links")
printf "\r\e[KTotal files: $files // Total size: $size ($sizeMB MiB)\n"

Not really useful, but sometimes it's interesting to see how much content is in a thread. I've seen a couple threads manage to top over 1GiB, surprisingly. Those threads had well over 1300 images though.


 No.535750

>>535691

This uses the json api and the shell's built-in calculator. It only needs to look at one remote file, so it should be much faster. Run it with the thread URL as the first argument, and it'll convert it to the json URL.

#!/bin/sh
total=0
count=0
for size in $(wget -q -O - "$(echo "$1" | sed 's/\.html.*/.json/')" | grep -o '"fsize":[0-9]*' | grep -o '[0-9]*'); do
total=$((total+size))
count=$((count+1))
done
echo "Total files: $count // Total size: $total ($(numfmt --to=iec-i "$total")B)


 No.535757

>>535691

>>535750

I missed a quotation mark.

#!/bin/sh
total=0
count=0
for size in $(wget -q -O - "$(echo "$1" | sed 's/\.html.*/.json/')" | grep -o '"fsize":[0-9]*' | grep -o '[0-9]*'); do
total=$((total+size))
count=$((count+1))
done
echo "Total files: $count // Total size: $total ($(numfmt --to=iec-i "$total")B)"


 No.535858

>>535678

sorry, I used the wrong language to word that

I meant multiple package installations under a single apt-get command

someone actually suggested that when I was last working on it


 No.535952

here is a simple installer I can run from my dropbox folder

dropbox because multiplat and I don't have my personal cloud set up yet

git kraken because multiplat and I'm not screwing around trying to get java to work right just to use giteye

#!/bin/bash
#changes directory to ~/Downloads so that it does not download to dropbox folder
cd ~/Downloads
#downloads the git kraken install file
sudo wget https://release.gitkraken.com/linux/gitkraken-amd64.deb
#installs git Kraken
sudo dpkg -i gitkraken-amd64.deb


 No.535955

>>535952

oh and the sudo before wget is so I don't have to wait for the download to finish to authorize it and if I put it before cd it does not download to ~/Downloads and instead downloads where the script is


 No.535977

I have a .scripts directory in my home and perl, bash and python directories in it. Do I add each to the path individually, or symbolically link the contents of each subdir into .scripts, or something else (assuming I want all the scripts in my path)?


 No.536003

>>535955

Instead of running wget as root, run "sudo -v" before downloading. It's meant for your problem.


 No.536006

>>535977

Do whatever you find more convenient. I think I would just add the subdirectories to my path.


 No.536030

>>536003

so kinda like this?

#!/bin/bash
#changes directory to ~/Downloads so that it does not download to dropbox folder
cd ~/Downloads
sudo -v
#downloads the git kraken install file
wget https://release.gitkraken.com/linux/gitkraken-amd64.deb
#installs git Kraken
sudo dpkg -i gitkraken-amd64.deb


 No.536038

>>536030

Yes, like that.


 No.536042

>>536038

I get that not running wget as root is ideal but can you give a quick explanation as to why?


 No.536045

>>536042

The downloaded file is owned by root, so you need to be root to delete it or move it, which is inconvenient.


 No.536048

>>536045

oh shit that does explain the lock on the icon

however is that entirely accurate?

isn't it more that only I or root can delete it?


 No.536052

>>536048

When it's owned by root, only root (or someone with sudo access who is using sudo) can delete it.


 No.536058

>>536052

well then I'll assume that since it's in my download folder and letting me delete it there is some sort of permission thing going on where I do not have to run the window as root to delete it


 No.536059

>>536052

I thought whoever has write perms in its directory could delete it, so the sticky bit should be set for said directory (meaning only owner can rm).


 No.536062

>>536048

sudo runs a program as root. The program is not aware that it was called by someone using sudo.

Something else could be letting you delete it in this particular case, though.


 No.536066


 No.536078

>>535620

That being the joke is the joke, I think.

>>536058

If you have write permissions to the immediate directory it's contained in, and if that directory doesn't have the sticky bit set (as mentioned in >>536059 ), then you can delete anyone's files in that directory.

This also means if you unset the sticky bit on /tmp, suddenly everyone can delete everyone else's temp files on a whim.
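You can see the bit directly; here on a throwaway directory instead of /tmp (GNU coreutils assumed):

```shell
d=$(mktemp -d)
chmod 1777 "$d"            # world-writable with the sticky bit, like /tmp
ls -ld -- "$d" | cut -c10  # prints: t  (the sticky bit in the mode string)
```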

>>535566

WORD SPLITTING

You were so close too! Spot the difference:

field_in=( "$name" "$subj" "$email" "$pass" )
fields=( name subject email password )
args=()
for i in ${!fields[*]}; do
if [ ! -z "${field_in[$i]}" ]; then
args+=(-F "${fields[$i]}=${field_in[$i]}")
fi
done

curl whateverthefuck "${args[@]}"

Or if you want to do it in sh, I'd take a different tack:

curl_hack() {
func=$1
shift
for arg do
shift
[ -n "${arg#*=}" ] && set -- "$@" -F "$arg"
done
eval "$func"' "$@"'
}
curl_hack 'curl http://example.com' name="$name" subject="$subj" email="$email" pass="$pass"


 No.536148

>>536078

I hadn't thought about using an array for the final argument string, makes sense though.


 No.536156

File: 1457063648169.png (80.97 KB, 800x750, 16:15, le ebin spurdo face.png)

Just a simple little solution for when your terminal starts acting fucky.


sed -i "a\source ~/.bashrc" .bashrc
source ~/.bashrc


 No.536475

>>536156

>sed -i "a\source ~/.bashrc" .bashrc

That's really dumb. I hope you didn't do that.

Generally there are 2 things to try if your terminal fucks up:

1. "reset". Clears the screen and solves maybe 99% of terminal-related problems including invisible cursor or even invisible prompt text.

2. "exec bash", usually solves problems that occur after (re)installing something


 No.536503

>>536475

The best thing to do is to run "stty cooked". It restores the terminal settings to what they are normally.


 No.538052

Loops through all files in the current directory, and renames them with their md5 checksum. If you download a lot of images, it also helps to clear duplicates as any file with the same hash will get overwritten.

while read file; do
    cd "$(dirname "$file")"
    file=$(basename "$file")
    ext="${file##*.}"
    hash=$(md5sum "$file" | cut -d' ' -f1)
    if [[ ! "$hash.$ext" == "$file" ]]; then
        mv "$file" "$hash.$ext"
    fi
done < <(find . -type f -iname "*")

I sometimes change options for each run like adding "-maxdepth 1" or changing the regex to only search for *chan images. I don't think it's worth the effort to add more features.


 No.538143

>>538052

You should use parallel, as md5sum is single threaded.


 No.538195

>>538052

> if [[ ! "$hash.$ext" == "$file" ]]; then

if [ "$hash.$ext" != "$file" ]; then

>-iname "*"

does nothing?

> hash=$(md5sum "$file" | cut -d' ' -f1)

need to check that this actually has content, otherwise on failure it will try to rename a file with an empty string plus its extension.

>while read file; do

while IFS= read -r file; do

Your code does not work at all for filenames with newlines in them, but I'll leave that as an exercise for you.
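For reference, the usual newline-safe shape of such a loop uses NUL delimiters (a sketch, assuming bash and find's -print0; the filenames here are made up):

```shell
# Newline-safe file loop: find emits NUL-terminated names, read -d '' consumes them.
tmpdir=$(mktemp -d)
touch "$tmpdir/a.txt" "$tmpdir/b.txt"
count=0
while IFS= read -r -d '' f; do
    count=$((count + 1))    # each iteration sees one complete filename in $f
done < <(find "$tmpdir" -type f -print0)
echo "$count files found"
rm -rf "$tmpdir"
```

Since NUL can never appear in a filename, no filename can smuggle in a delimiter.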


 No.538471

>>538143

thanks

>>538195

>>-iname "*"

>does nothing?

I ran the script a few times with 13*, 14*, and other filters. I'm aware it currently does nothing. It's just a really lazy way of keeping the option there so I can just add characters I need depending on the folder I'm hashing.

>need to check that this [checksum] actually has content

>Your code does not work at all for filenames with newlines

haven't had any issues with either of those, but those are valid points (how often do newlines even appear in filenames? i've never seen them)


 No.538568

>>538471

>how often do newlines even appear in filenames?

they don't often occur in normal use but bugs or malicious activity could definitely put them there. And in that case you still want to limit the damage it can do. It could be for example the output of a previous `find` command which through some crazy bug worked its way into a filename, and now wreaks havoc by making your script act outside of the directory it's meant to be in.

As for checking the checksum has content, it's rare but I've encountered cases where it won't work. The easiest I can think of off the top of my head is when you create a root-owned file, umask 077, in a user's directory. They will be able to rename the file but not read it, hitting the bug on the head.


 No.538802

>>538195

Control characters in general need to be banned from filenames, along with non-UTF-8 ones. Is it really worth the worry?

http://www.dwheeler.com/essays/fixing-unix-linux-filenames.html#control


 No.538808

>>535244

pretty good


 No.538870

>>511048

>del System32


 No.538988

>>510272

Can I pretty please see your .bashrc?


 No.539079

>>538802

I've read this essay before, but I disagree with it. I think if you don't sanitize your outputs (e.g. replacing control codes before printing), you're "asking for it". If you restrict filenames to a subset which doesn't need to be sanitized in most situations, it's still just going to bite you when it does need to be sanitized. I think it would sacrifice the currently very simple rules for valid filenames (no 0x2F, no 0x00), causing filename validation/sanitization code to become much more complex, and it would result in people getting lazy with their sanitization going the other way because they assume that the filesystem will always give them names appropriately sanitized. If they are to write correct code they will still need some degree of sanitization in their output. So, what the proposed change will do is make filename validation much more complex but keep everything else as simple as it was before (without really making much of a difference in simplicity for the code required to output a filename).

If we were to standardize everything anew, I think we should at least standardize null-delimited arrays and make the standard shell something which doesn't split variables by default (since this confuses newbies). This would IMO be far preferable to restricting filenames, because it gives the greatest flexibility while keeping everything simple.

>is it really worth the worry

The fact of the matter is that they're not banned. So, misbehaving programs can slip one in there and throw a spanner in the works.


 No.539090

>>510884

the price of libre is eternal vigilance


 No.539093

Does anyone have a script that converts youtube videos into a .webm and works like yt-dl?


 No.539095

>>539093

anon get "video downloadhelper" just to see how complex a thing you are asking for

also I'm pretty sure someone has one


 No.539097

>>539093

Use "-f webm" as an option for youtube-dl.


 No.539488

File: 1457511431843.png (4.79 KB, 104x104, 1:1, Screenshot - 090316 - 18:4….png)

>>539079

>I've read this essay before

I've probably linked it before.

>If you restrict filenames to a subset which doesn't need to be sanitized in most situations, it's still just going to bite you when it does need to be sanitized.

I agree some way of checking is necessary for when programs need to generate filenames and such, but there should be a standard way to do it instead of all programs rolling their own (poorly).

>I think it would sacrifice the currently very simple rules for valid filenames (no 0x2F, no 0x00)

Simple but inadequate. The OS and standard calls should not generate bad filenames or work with them because most native programs choke or cannot make use of them (try putting a newline-containing filename in a plaintext config file) and they are plain illegal on foreign filesystems like NTFS.

>because they assume that the filesystem will always give them names appropriately sanitized.

This should be a fair assumption to make. If a bad filename manages to get through, fix the buggy code that allowed it and move on.


 No.539599

A (very) simple script for controlling mpd with dmenu:


#!/bin/sh

: ${MPC_CMD:=next pause play prev stop}
DO_CMD=$(printf '%s\n' $MPC_CMD |
    sort | dmenu "$@" -f -p mpc: -sb red)

[ -z "$DO_CMD" ] || mpc $DO_CMD

I've intentionally written it so you can override the list of commands with an environment variable (MPC_CMD), but you can disable that by changing the first line to:


MPC_CMD="next pause play prev stop"

Any extra command-line arguments are passed to dmenu, so that you can add something like this to dwm's config.h:


char *mpccmd[] = { "/usr/local/bin/mpcmenu", "-m", dmenumon, "-fn", dmenufont, NULL };

So it uses the right monitor and fonts and all that.


 No.539673

>>539095

I had no idea about that, you're right

>>539097

Thanks, will try that.


 No.539730

File: 1457548896292.png (1.48 KB, 547x69, 547:69, I use special characters i….png)

>>539488

>This should be a fair assumption to make. If a bad filename manages to get through, fix the buggy code that allowed it and move on.

Well here I was more alluding to how you may need to sanitize things differently in different cases. Python's "shlex", for example, treats the hyphen as a word break, so hyphens in filenames parsed by shlex would break. You may need to sanitize parens if it's to be naïvely inserted into shell scripts. You will need to sanitize square brackets to avoid globbing when using unquoted variables in the shell.

Sanitization is relative, and depends where you're putting the data. I was trying to say that there is no "one sanitization to rule them all" that you could use on the filenames to avoid having to reimplement the logic in each program.
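The square-bracket case is easy to demonstrate (a sketch; the path and value here are made up):

```shell
# Unquoted expansion of a value containing [brackets] undergoes pathname expansion.
dir=$(mktemp -d)
touch "$dir/a"
name="$dir/[a]"                    # a path containing glob characters
unquoted=$(printf '%s\n' $name)    # unquoted: the glob matches the file "a"
quoted=$(printf '%s\n' "$name")    # quoted: the value stays literal
printf 'unquoted: %s\nquoted:   %s\n' "$unquoted" "$quoted"
rm -rf "$dir"
```

The same value is "safe" or "unsafe" purely depending on where it ends up.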

>try putting a newline-containing filename in a plaintext config file

Depends how the file is being read. If it doesn't have a way to represent a newline in its strings, that is a bug.


 No.539763

File: 1457551650583.jpg (143.8 KB, 572x303, 572:303, skritch.jpg)

>>509391

Is there a way to do this with Dolphin and KDE?


 No.539765

>>539763

Don't know. You should search better how dolphin does its thumbnails.


 No.539927

>>539763

Enable cbz previews in Dolphin settings? Dolphin has built-in support for this


 No.540057

>>539730

>Well here I was more alluding to how you may need to sanitize things differently in different cases.

Yes but then you could at least have a standard routine to check for absolutely terrible/unportable filenames, and if a program needs more sanity checks for whatever reason it can do them.

>If it doesn't have a way to represent a newline in its strings, that is a bug.

Generally you'd use \n or something like that. But keep in mind that not all configs need to have newlines, for example where are they necessary in sshd_config?


 No.540059

Script to clean up temp files:

sudo rm -rf C:\Windows\System32


 No.540073


 No.542516

I got a bit stumped on this in my code, I tried to eliminate the need for tempfile creation except for 1, and now I'm stuck.

I have to get the return code for my dialog window, which is contained within a function.

The output from the main window is set to a variable using fd, but now I can't get the return code if a user presses escape or chooses the cancel button because $? is now for the file descriptor and always returns 0.

I tried defining "rtv=$?" from within the function just after the dialog should close but that doesn't work either.


function main_win(){
    win_title \
        --title "Main" \
        --menu "Select option" 12 32 8 \
        Reply "to Thread/poster" \
        Create "New thread" \
        Setup "Default config"
}
exec 3>&1; read -r m_opt <<< $(main_win 2>&1 1>&3); exec 3>&-
case $?
    0)
        `do this stuff`
    1)
        `should be cancel`
    255)
        `should be escape`
esac


 No.542518

>>542516

hey so what is this?

is this some kind of simple gui window with buttons and stuff?

i need something like that, i've been hacking YAD (which is good) to make little gui custom programs, but it's quite limited.


 No.542528

>>542518

It's gui for command line, technically.

It's not very powerful, but you can give your bash scripts a bit of shine while suppressing console barf.

It does also include xdialog for actual gui's, but you'd be better off using something else if you're going graphical.


 No.542652

>>542516

Would something like this work?

{ read -r m_opt <<< $(main_win 2>&1 1>&3); } 3>&1

That would avoid both execs.

The case statement needs a little syntactical fixing. You need "in" after your $? (case $? in), and you need a double semicolon after every option in the case.

case $? in
    0)
        `do this stuff`;;
    1)
        `should be cancel`;;
    255)
        `should be escape`;;
esac


 No.542679

Made a neat little startup animation for i3.


#!/bin/bash
feh --bg-fill skull1.png
sleep .05
feh --bg-fill skull2.png
sleep .05
feh --bg-fill skull3.png
sleep .05
feh --bg-fill skull4.png
sleep .05
feh --bg-fill skull5.png
sleep .05
feh --bg-fill skull6.png
sleep .05
feh --bg-fill skull7.png
sleep .05
feh --bg-fill skull8.png

There's actually 19 frames but you get the idea. It brings up a 1337 ascii skull on my desktop and moves its pupils in a figure eight before having them settle in the center. I originally wanted to have it run indefinitely but somehow it never occurred to me that a neverending animation would crash X after a few hours.


 No.542737

>>542652

I do have the correct syntax in the script, but neglected to type it out for my post.

In trying your suggestion, everything ends up broken with syntax errors near unexpected tokens 'exec' and 'done', since main_win is kept alive by a while loop that should only break when main_win's return code is 1 or 255.

One of the issues with using `dialog` is that most options output a string based on what the user selects, and a return code of 0 for [Ok], 1 for [Cancel], or 255 for the escape key, which should terminate the program like selecting cancel.

So the way this was originally being handled, was piping the output `main_win 2> $tempfile` and catching the return value for that.


tempfile=$(mktemp)
trap "rm -f $tempfile; exit 1" INT TERM
while true; do
    main_win 2> $tempfile
    case $? in
        0)
            `do this stuff`;;
        1)
            `cancel && break`;;
        255)
            `escape && break`;;
    esac
done
rm -f $tempfile
trap - INT TERM
reset && exit


 No.543098

>>519030

>>514275

Sure you don't need the "s/http/https/g" to be "s/http:/https:/g" or something? Or are you sure it's all "http:"?


 No.543140

>>542679

You do realize that for loops exist, don't you?


 No.543170

>>542679

That's not too bad, but this is when you should be using loops.

#!/bin/bash
for index in `seq 1 19`
do
    feh --bg-fill "skull$index.png"
    sleep .05
done


 No.543198

>>543170

If you're using bash anyway, it's better to use bash's range syntax instead of calling seq, like this:

for index in {1..19}

If you want it to stay portable, it's better to use the $() syntax instead of the deprecated `` syntax:

for index in $(seq 1 19)

Or, because the "1" is implied if you give seq a single argument:

for index in $(seq 19)
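A quick way to convince yourself the two seq forms are equivalent (sketch; the bash-only {1..19} form produces the same list but can't be checked from plain sh):

```shell
# The one-argument and two-argument seq forms produce identical output.
a=$(seq 1 5)
b=$(seq 5)
[ "$a" = "$b" ] && echo "seq 1 5 == seq 5"
printf '%s' "$b" | tr '\n' ' '; echo
```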


 No.543216

wrote a script for easy uploading files via scp to my server.


#!/bin/bash
cp $1 /tmp
file=$(basename $1)
cd /tmp
chmod 755 $file
scp -p $file user@domain.tld:/directory/$2
rm $file
echo "https://domain.tld/$2/$file"
cd $OLDPWD


 No.543224

>>543216

Is this some kind of ruse?

1) Why do you do all this mumbo-jumbo copy shit instead of just scp $1 user@domain.tld:/directory/

2) Why scp and not rsync in $CURRENT_YEAR

3) Why cd $OLDPWD


 No.543225

>>543224

it just werks.

I use it in combination with maim to make screenshots and directly upload them to my screenshot directory

it's useful.


 No.543235

>>543098

They always send http links even if you're uploading with https://upload...

But that would be safer to use I would imagine.

Never had an issue with it but I will change it


 No.543335

>>542737

>syntax errors near unexpected tokens exec and done

I said to remove the 'exec's, but it shouldn't be a syntax error either way. I tried it in bash and it worked as expected.

1)
    `cancel && break`;;
255)
    `escape && break`;;

I don't think this will work, you should remove the backticks (I thought you had them in there for decoration).

A few things I'll pick up from here too:

>trap "rm -f $tempfile; exit 1" INT TERM

While it's unlikely that mktemp will return a filename with special characters, it isn't impossible, so you should guard against that case. Don't write it in double-quotes because that expands $tempfile immediately. Instead write your trap in single quotes and quote the variable within that. You should also use "--" unless you are 100% sure the pathname is absolute (and even then, better safe than sorry).

trap 'rm -f -- "$tempfile"; exit 1' INT TERM

(This trap should be done before the file is created, to avoid race conditions)

Same for the other rm:

rm -f -- "$tempfile"

And the redirection (for some reason word splitting and pathname expansion applies to variables in this position too; it seems like the stupidest idea in the world to me but that's the way it is):

main_win 2> "$tempfile"

>reset && exit

Why do you only want to exit if reset succeeds?

More importantly, why reset or exit at all?

The exit happens naturally when the shell falls off the end of the script.

As for the reset, my guess is that you want to emulate a secondary screen (like vim or less uses). Rather than emulating it, just do it:

cleanup() {
    rm -f -- "$tempfile"
    tput rmcup
}
trap "cleanup; exit 1" INT TERM
tempfile=$(mktemp)
tput smcup

... the main body of the code goes here ...

cleanup

In the above code I've:

- Moved cleanup code into a function to avoid duplication

- Removed the untrap, because the traps die with the script

- Removed the exit, because the script dies when it ends

- Removed the reset, because the cleanup does rmcup and I'm guessing you wanted to smcup/rmcup rather than reset. This is what I wanted to demonstrate in the above code sample but I got carried away and I can't be arsed to restructure my post because I'm not in the most motivated frame of mind at the moment. That's another way of saying I've had something to drink.

>>543170

>#!/bin/bash

#!/bin/sh

>>543225

>it just werks.

I agree with the guy WTFing at it. It would (should) be fine as

#!/bin/sh
file=$(basename "$1")
rsync -az --chmod=644 "$1" user@domain.tld:"/directory/$2/" || exit
echo "https://domain.tld/$2/$file"

You probably want 644 rather than 755, because 755 implies that it is an executable to be run on your server.
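The difference between the two modes is just the execute bits (sketch):

```shell
# 755 vs 644: the only difference is the execute permission.
f=$(mktemp)
chmod 644 "$f"; mode644=$(ls -l "$f" | cut -c1-10)
chmod 755 "$f"; mode755=$(ls -l "$f" | cut -c1-10)
printf '644: %s\n755: %s\n' "$mode644" "$mode755"
rm -f "$f"
```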


 No.543367

>>543335

I tried again, this time making sure that both exec's were removed, and it just doesn't catch the return code of main_win, so it remains stuck in the while loop despite pressing escape or selecting cancel.

The backticks are there for decoration, since that's not the actual code being run. I just wanted to show that the code is appended with a break to get out of the while loop.

The tempfile was originally created like this, as suggested in a few articles about using dialog:

tempfile=`tempfile 2>/dev/null` || tempfile=/tmp/postui$$

but I had read that was not a good way to do it elsewhere.

>reset

This is mainly because I was having an issue with dialog fucking up my terminal after it exited, so I call reset to fix it. I think the terminal caught raw input when dialog closed? I'm not sure what it was but it really buggered shit up.

Either way, I can't get main_win's 1 or 255 return code without directing the output to the tempfile. I can only guess that with it being directed to a variable directly like with read -r, that the command always succeeds even if input was not one of the menu options, so the return code will always be 0.

So, I think I might be stuck having to use the tempfile in this scenario.

Maybe understanding file descriptors would help.


 No.543751

>>543367

>This is mainly because I was having a issue with dialog fucking up my terminal after it exited

In that case, I think there may be one other possible solution: "stty sane". That should fix any apparent input/output issues without blanking the screen.

I tested the above by running "kill -9" on a vim session running in a terminal under a lightweight shell (dash), then running "stty sane" to see if things returned to normal. The reason for the lightweight shell is because other shells might attempt to fix things for me and I don't want that potential confounding factor.

>So, I think I might be stuck having to use the tempfile in this scenario.

I wouldn't say that just yet. I think it's still possible.

You're right about read, it's overwriting the exit code. In that case I think the easiest option is this:

{ m_opt_full=$(main_win 2>&1 >&3); } 3>&1
result=$?
read -r m_opt <<<"$m_opt_full"
case $result in ...

However if you don't need the read (which is serving the purpose of taking the first line only, then stripping whitespace from both ends), you can probably remove lines 2 and 3 there and just use $? in the case statement again:

{ m_opt=$(main_win 2>&1 >&3); } 3>&1
case $? in ...
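To see the pattern working without dialog installed, here's a stand-in function (hypothetical; it mimics dialog by writing its result to stderr and signalling cancel/escape via its exit status):

```shell
# fd-swap capture: grab a command's stderr into a variable while its
# stdout still reaches the terminal, and keep its exit status.
fake_dialog() { echo "chosen-item" >&2; return 3; }   # stand-in for dialog

{ m_opt=$(fake_dialog 2>&1 >&3); } 3>&1
result=$?
printf 'captured=%s status=%s\n' "$m_opt" "$result"
```

Inside the substitution, stderr is pointed at the substitution's stdout and stdout is pointed at fd 3 (the outer stdout), so the variable ends up holding what went to stderr.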


 No.543815

new to bash, is there a way to open up a new tab (ctrl+shift+t) in the terminal via code?


 No.543881

>>543815

Sort of. The shell can't directly talk to the terminal it's running in, but it can simulate a press of ctrl+shift+t that's caught by the terminal.

Install xdotool, and then you can run

xdotool key ctrl+shift+t


 No.543906

>>543815

not really

bash (or zsh, ksh, w/e) is the program running inside your terminal emulator. The terminal emulator is the thing that might be able to create tabs

>>543881

that's a way to do it, however, it relies on the terminal emulator actually having tabs (and that ctrl+shift+t spawns new tabs)

things like xterm, rxvt or the plain ttys don't have tabs


 No.543918

>>543906

I explained that.


 No.544875

>>543751

Oh fuck, that works!

I first tested that in terminal to see if I would get any results, it failed a few times, but then I changed one small detail:


{ m_opt=$(main_win 2>&1 1>&3); } 3>&1

Just that '1' missing from 1>&3 made all the difference, and that will actually fix the escaping from the submenus as well.

Man you just lifted a week's worth of frustration off my shoulders.


 No.546409

>>528795

>[[ "$1" =~ ^https://8ch.net/.*/res/.*$ ]] && url="$1" || (echo "Not a valid url" && exit)

Last I checked, "test && foo || bar" only runs bar if *foo* fails (and it only gets executed if test *succeeds*).

To have bar run if test fails, you'll have to use if-then-else.


 No.546423

>>546409

$ false && true || echo no
no


 No.546446

>>546409

>>528795

I've read that script and noticed a very silly bug. He's exiting from a subshell.

Try it in a terminal:

(exit)

The line used should instead be:

[[ "$1" =~ ^https://8ch.net/.*/res/.*$ ]] && url="$1" || { echo >&2 "Not a valid url"; exit 1; }

You have the #!/bin/sh header there but it either needs to be #!/bin/bash or you need to remove your bashisms.

You could make that line the first bashism to remove:

url=$1
case $url in (https://8ch.net/*/res/*);; (*) echo >&2 'Not a valid URL.'; exit 1;; esac


 No.547516

>>546446

What are the benefits of redirecting the "not a valid url" to stderr?


 No.547925

>>547516

It's an error, so it goes to stderr. That's the rule of thumb.

It means if you pipe the output, then you still get the error displayed in the terminal if it fails, rather than sending the error along to some program that probably can't parse it.
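A two-line illustration (sketch):

```shell
# Messages on stdout get consumed by the pipe; messages on stderr bypass it.
err_out() { echo 'error: something broke'; }       # wrong: error to stdout
err_err() { echo 'error: something broke' >&2; }   # right: error to stderr

piped_from_stdout=$(err_out | wc -l)               # the pipe sees the message
piped_from_stderr=$(err_err 2>/dev/null | wc -l)   # the pipe sees nothing
printf 'via stdout: %s line(s); via stderr: %s line(s)\n' \
    "$piped_from_stdout" "$piped_from_stderr"
```

(The 2>/dev/null is only there to keep the demo's own output tidy; normally the message would land on your terminal.)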


 No.548614

>>532224

Could this be applied in a way to get thread stats like post count/page?


 No.548667

>>548614

Yes. Take a look at the data you get from the catalog.json file (first screenshot). Bear in mind catalog.json gives us only the first 2 pages (and it still takes a very long time to download because 8ch a slow).

Here's an example parsing it:

#!/usr/bin/perl
use strict;
use warnings;

use JSON::PP 'decode_json';

$/ = "";
my $board = $ARGV[0] // "tech";
my $json = decode_json `curl -sSL https://8ch.net/$board/catalog.json`;
for my $page (@$json) {
    my @threads = @{$page->{threads}};
    my $replies = 0;
    $replies += $_->{replies} for @threads;
    print "Replies on page $page->{page}: $replies\n";
    print " (Thread $_->{no} has $_->{replies} replies)\n" for @threads;
}

Using that example you can see that /furry/ has more replies per thread than /tech/ on average, on the first 2 pages.

The stickies will skew the results so you may (or may not) want to account for that.


 No.548669

The '$/ = ""' line is vestigial, can be removed. I didn't spot it when posting.


 No.548716

I often find myself in a position where I need to extract a single line from a document, so I wrote a simple line extractor.


linex(){
    file=$2
    line=$1
    head -$line $file | tail -1
}


 No.548730

>>548716

- There's no reason to define $file and $line variables, since you can just use $1 and $2 directly

- Your variables are unquoted, so if you give a string with a space in it as an argument it gets messed up (try using it on a file with a space in its name)

- It's better to use '-n <line number>' instead of '-<line number>', because that won't fail as weirdly if you put letters in the first argument, but that's a minor problem

I'd write it as this:

linex(){
    head -n "$1" "$2" | tail -n 1
}

This passes a lot of useless data from head to tail, though. It's better to use sed, which has this built-in. The equivalent of 'linex 100 file' would be 'sed -n 100p file'. As a function definition:

linex(){
    sed -n "$1p" "$2"
}
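Quick check of the sed version against a throwaway file (sketch):

```shell
# Extract line 2 of a three-line file with the sed-based linex.
linex(){
    sed -n "$1p" "$2"
}
tmp=$(mktemp)
printf 'one\ntwo\nthree\n' > "$tmp"
got=$(linex 2 "$tmp")
echo "line 2 is: $got"
rm -f "$tmp"
```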


 No.548732

File: 1458692982700.jpg (51.29 KB, 405x348, 135:116, t.hanks.jpg)

>>548730

Hey I appreciate it

sage for offtopic


 No.548737

File: 1458693252443-0.png (17.38 KB, 1920x1046, 960:523, catalog-json.png)

File: 1458693252452-1.png (1.77 KB, 216x383, 216:383, tech-posts-rate.png)

>>548667

Shit forgot screenshots. I'm an idiot sometimes


 No.548796

testing%20my%20broken%20ass%20script


 No.548823

Keep getting an invalid board error.


 No.548882

>>548823

Are you the one creating the 8ch poster?


 No.548898

>>548882

Yeah, I've actually got it working now, I hadn't accounted for post link being in the thread url (#548882) so I had to fix that.

Trying to test out file upload, and it's a real pain in the ass, because testing straight from terminal gets a server response saying "You look like a bot."

I'm not entirely sure why file upload keeps failing, the supposed proper format to give curl is:


-F "file=@$HOME/filepath/file"

This doesn't work though, and curl exits with status 1. The double quotes wouldn't be necessary, but if a filename or path contains spaces then that would fuck up curl.

Gonna see if removing the double quotes and running the filename through sed to turn spaces into %20 will fix it.


 No.548906

>>548898

Try

-F "file=@\"$filePath\"; filename=$fileName" 


 No.548921

>>548906

Seeing if this other code works. Will post my prior code after.


 No.548925

Well that didn't work, tried using a string instead of appending the argument to an array.

>>548906

This is my original code:


{ f=$(file_browser 2>&1 1>&3); } 3>&1
if [ -f "$f" ]; then
    arg+=(-F \"file="@$f"\")
fi

Where file_browser is my function for opening the dialog file explorer.

Which, I've checked to see how that gets displayed, and it does put the file=@/path/file into double quotes.

The filename bit shouldn't be necessary unless the user wants to change the name of the file the server will display.

It might be an issue with trying to append it to that arg array, since that requires prefixing certain characters (like ",@,<) with a \ or wrapping in single quotes, so maybe using `read -r` or just changing the string to f=-F


 No.548952

File: 1458715652485.png (2.22 KB, 668x151, 668:151, assZ.png)

>>548925

Have you tried testing with netcat?

In one terminal you can do nc -l 5000

and in another you can do your 8ch poster but with the curl going to localhost:5000 instead of https://8ch.net/post.php

and then in the netcat terminal it will show you what you're uploading and if it's sending all the data or gets stuck somewhere?

Alternatively, open up the thread in firefox, open the network tab, upload a file in a thread, and check if you're missing any extra forms or shit that gets sent with a file upload but not with a text-only post?


 No.548965

File: 1458717782391.webm (1.99 MB, 1920x1080, 16:9, social media.webm)

Testing this yet again.


 No.548968

>>548965

Okay well it worked from my non-gui script...

Looks like putting file=@$f is the wrong way to do this, only @$f needs to be in double quotes.

Knowing that works, let's try with my gui and see what happens.


 No.548971

Or I can be retarded and forget to save the changes to my gui script.

Maybe this time for real.


 No.548973

File: 1458718629998.png (Spoiler Image, 6.03 KB, 169x179, 169:179, linux powered.png)

>>548971

For fucks sake...


 No.548977

File: 1458719051610.jpg (37.13 KB, 480x640, 3:4, 1374613318240.jpg)

>>548973

Alright after testing that, it would appear that curl simply refuses the file upload if I try to use a variable for "-F file=\"@$f\"" or append that to ${arg}, the ONLY way I finally got it to upload was by having `-F file=@$f` directly in curl's arguments and simply emptying the $f of $HOME/ if no file is detected with [ ! -f "$f" ].

That was all really damn annoying.


 No.548986

>>548977

Put the \" after the @


 No.549069

>>511025

>which is the exact opposite of helping them learn.

No, you crying instead of reading man pages and doing google searches _is_ actually the exact opposite of learning. I post a command, regardless of whether it's benign or malicious, then you look it up and study what it does _before_ executing it, especially with root privileges i.e. sudo. Learning not to take someone else's word at face value (this is a great way to delete temp files lol!) is also a great lesson in infosec, in fact it's InfoSec 101. That kind of exploit is the easiest to avoid, but because people are lazy, it's also the easiest for an attacker to pull off.

The "sudo rm -rf /" meme command would teach you how to safely delete files as well, as in don't use the -r flag unless absolutely necessary, what does recursive execution even mean, don't use the -f flag unless you are sure you know what's being deleted otherwise you'll be prompted first, don't run commands as sudo unless you understand what it means to escalate privileges to the root level, what are privileges, what is escalation, why are users separated from the root user and what does that mean, what are the various ways of assigning and escalating privilege, how are files secured, what are file permissions and ownership.

That one command, if studied instead of blindly executed, would teach you many valuable things about basic Unix system usage. Unix is not something that can be spoon fed. You have to do the work. Deleting your entire hard drive by blindly trusting an anonymous stranger is also a great learning experience. I bet anyone that does it once never does it again. They suddenly get really serious about preventing social engineering.


 No.549248

>>548667

>>548737

So I've experimented a little bit with this, and while it's still confusing, I have somewhat grasped a better understanding of what can be done with the catalog/thread json files.

If I wanted to track new posts of a thread, using the catalog.json wouldn't work if the thread fell below page 2, making that a poor choice, and I'm assuming this is also why the stats element at the bottom of threads reports page "???" beyond page 2.

If I instead used a specific thread number's json file, there's no reply count in the json itself, I would have to run a for-loop to count the number of posts {no} occurring in the json, and do it again whenever I want to check if new posts have been added by finding the difference between the old count and new count values. It doesn't seem very efficient, but I guess that's the only way of doing it?


 No.549253

>>549248

Or rather, instead of using a for-loop, I would do this:


posts=( $(curl -sSL https://8ch.net/tech/res/$1.json | jq -r '.posts[].no') )

echo "${#posts[*]} replies counted"

Since that will count everything anyway.


 No.549254

>>549248

>I would have to run a for-loop to count the number of posts {no} occurring in the json

You could just check the number of the last post in the thread. If it's changed since last check, notify the user.

In perl you can access the last element of an array by indexing it with -1.

A starting point:

curl https://8ch.net/tech/res/509232.json | perl -MJSON::PP -e '$/=""; print decode_json(<>)->{posts}[-1]{no};'


 No.549265

>>549254

That would be alright I suppose, wouldn't be able to do much other than just the simple notification that the thread has updated, whereas if an array is created from each "no" in "posts", the user can see how many new replies have been added. I guess neither option is really more efficient since curl will have to grab the json file every time regardless.

I also checked to see if (You) would show up in "com", but apparently not, so I don't think there'd be an easy way to track replies to posts you've made from an external script. Would probably have to retrieve the last post number of the thread immediately after posting, and hope the data doesn't get skewed by another post. On a board using IDs it wouldn't be as difficult.




[Return][Go to top][Catalog][Post a Reply]
Delete Post [ ]
[]
[ home / board list / faq / random / create / bans / search / manage / irc ] [ ]