Batch: unpacking with a bat file

peterpan36

Greenhorn
Hi,
I'm new to the forum and hope you can help me out, since searching hasn't gotten me anywhere.
I'd like to create a batch file that unpacks a zip file into a specific directory. Even better would be if that file could also be downloaded from an FTP server first. It would be nice if you could explain the commands a bit, because I don't know anything about coding.
The whole point is to save myself the manual updating of my antivirus program.

If anyone has an idea, I'd be really grateful.

thx
 
OK, I've figured out the unpacking part now.
This does the trick:
c:/programme/winrar/winrar.exe x c:/test *.* c:/test/test1/test2/.

It would be nice, though, if someone had an idea how to download the files from an FTP server.

thx
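
For the record, here is a minimal batch sketch of that extraction step (the archive name update.zip and all paths are just assumptions; WinRAR's command-line syntax is winrar x <archive> [files] <destination\>):

Code:
rem Sketch only -- archive name and paths are assumptions.
rem x = extract with full paths, -y = answer Yes to all prompts so it runs unattended.
"C:\Programme\WinRAR\WinRAR.exe" x -y "C:\test\update.zip" *.* "C:\test\test1\test2\"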
 
Windows XP comes with an ftp client. Just type ftp -? and have a look at the various options.
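
A minimal sketch of how that can be scripted (server, login and file names below are placeholders): the XP ftp client can read its commands from a file via -s:, -n suppresses the automatic login so the user command can supply the credentials, and -i turns off interactive prompting.

Code:
rem getupdate.bat -- sketch only, server/credentials/paths are assumptions
rem Write the ftp command script on the fly, then hand it to the built-in ftp client.
echo open ftp.example.com> ftpcmd.txt
echo user myuser mypassword>> ftpcmd.txt
echo binary>> ftpcmd.txt
echo get update.zip C:\test\update.zip>> ftpcmd.txt
echo bye>> ftpcmd.txt
ftp -n -i -s:ftpcmd.txt
del ftpcmd.txt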
For anyone still interested ;)

An alternative to the Windows ftp client would be an application called wget.
It downloads the stuff for you as well, in my eyes a bit more easily, but that's a matter of taste.
In any case, I think you'll need a fixed file name + URL to pull this off ;) (a combined example follows below the help output)


Wget (gnuwin32.sourceforge.net)


Wget download (privately hosted)

Code:
> wget --help
GNU Wget 1.9.1, a non-interactive network retriever.
Usage: wget [OPTION]... [URL]...

Mandatory arguments to long options are mandatory for short options too.

Startup:
  -V,  --version           display the version of Wget and exit.
  -h,  --help              print this help.
  -b,  --background        go to background after startup.
  -e,  --execute=COMMAND   execute a `.wgetrc'-style command.

Logging and input file:
  -o,  --output-file=FILE     log messages to FILE.
  -a,  --append-output=FILE   append messages to FILE.
  -d,  --debug                print debug output.
  -q,  --quiet                quiet (no output).
  -v,  --verbose              be verbose (this is the default).
  -nv, --non-verbose          turn off verboseness, without being quiet.
  -i,  --input-file=FILE      download URLs found in FILE.
  -F,  --force-html           treat input file as HTML.
  -B,  --base=URL             prepends URL to relative links in -F -i file.

Download:
  -t,  --tries=NUMBER           set number of retries to NUMBER (0 unlimits).
       --retry-connrefused      retry even if connection is refused.
  -O   --output-document=FILE   write documents to FILE.
  -nc, --no-clobber             don't clobber existing files or use .# suffixes.

  -c,  --continue               resume getting a partially-downloaded file.
       --progress=TYPE          select progress gauge type.
  -N,  --timestamping           don't re-retrieve files unless newer than local.

  -S,  --server-response        print server response.
       --spider                 don't download anything.
  -T,  --timeout=SECONDS        set all timeout values to SECONDS.
       --dns-timeout=SECS       set the DNS lookup timeout to SECS.
       --connect-timeout=SECS   set the connect timeout to SECS.
       --read-timeout=SECS      set the read timeout to SECS.
  -w,  --wait=SECONDS           wait SECONDS between retrievals.
       --waitretry=SECONDS      wait 1...SECONDS between retries of a retrieval.

       --random-wait            wait from 0...2*WAIT secs between retrievals.
  -Y,  --proxy=on/off           turn proxy on or off.
  -Q,  --quota=NUMBER           set retrieval quota to NUMBER.
       --bind-address=ADDRESS   bind to ADDRESS (hostname or IP) on local host.
       --limit-rate=RATE        limit download rate to RATE.
       --dns-cache=off          disable caching DNS lookups.
       --restrict-file-names=OS restrict chars in file names to ones OS allows.

Directories:
  -nd, --no-directories            don't create directories.
  -x,  --force-directories         force creation of directories.
  -nH, --no-host-directories       don't create host directories.
  -P,  --directory-prefix=PREFIX   save files to PREFIX/...
       --cut-dirs=NUMBER           ignore NUMBER remote directory components.

HTTP options:
       --http-user=USER      set http user to USER.
       --http-passwd=PASS    set http password to PASS.
  -C,  --cache=on/off        (dis)allow server-cached data (normally allowed).
  -E,  --html-extension      save all text/html documents with .html extension.
       --ignore-length       ignore `Content-Length' header field.
       --header=STRING       insert STRING among the headers.
       --proxy-user=USER     set USER as proxy username.
       --proxy-passwd=PASS   set PASS as proxy password.
       --referer=URL         include `Referer: URL' header in HTTP request.
  -s,  --save-headers        save the HTTP headers to file.
  -U,  --user-agent=AGENT    identify as AGENT instead of Wget/VERSION.
       --no-http-keep-alive  disable HTTP keep-alive (persistent connections).
       --cookies=off         don't use cookies.
       --load-cookies=FILE   load cookies from FILE before session.
       --save-cookies=FILE   save cookies to FILE after session.
       --post-data=STRING    use the POST method; send STRING as the data.
       --post-file=FILE      use the POST method; send contents of FILE.

HTTPS (SSL) options:
       --sslcertfile=FILE     optional client certificate.
       --sslcertkey=KEYFILE   optional keyfile for this certificate.
       --egd-file=FILE        file name of the EGD socket.
       --sslcadir=DIR         dir where hash list of CA's are stored.
       --sslcafile=FILE       file with bundle of CA's
       --sslcerttype=0/1      Client-Cert type 0=PEM (default) / 1=ASN1 (DER)
       --sslcheckcert=0/1     Check the server cert agenst given CA
       --sslprotocol=0-3      choose SSL protocol; 0=automatic,
                              1=SSLv2 2=SSLv3 3=TLSv1

FTP options:
  -nr, --dont-remove-listing   don't remove `.listing' files.
  -g,  --glob=on/off           turn file name globbing on or off.
       --passive-ftp           use the "passive" transfer mode.
       --retr-symlinks         when recursing, get linked-to files (not dirs).

Recursive retrieval:
  -r,  --recursive          recursive download.
  -l,  --level=NUMBER       maximum recursion depth (inf or 0 for infinite).
       --delete-after       delete files locally after downloading them.
  -k,  --convert-links      convert non-relative links to relative.
  -K,  --backup-converted   before converting file X, back up as X.orig.
  -m,  --mirror             shortcut option equivalent to -r -N -l inf -nr.
  -p,  --page-requisites    get all images, etc. needed to display HTML page.
       --strict-comments    turn on strict (SGML) handling of HTML comments.

Recursive accept/reject:
  -A,  --accept=LIST                comma-separated list of accepted extensions.

  -R,  --reject=LIST                comma-separated list of rejected extensions.

  -D,  --domains=LIST               comma-separated list of accepted domains.
       --exclude-domains=LIST       comma-separated list of rejected domains.
       --follow-ftp                 follow FTP links from HTML documents.
       --follow-tags=LIST           comma-separated list of followed HTML tags.
  -G,  --ignore-tags=LIST           comma-separated list of ignored HTML tags.
  -H,  --span-hosts                 go to foreign hosts when recursive.
  -L,  --relative                   follow relative links only.
  -I,  --include-directories=LIST   list of allowed directories.
  -X,  --exclude-directories=LIST   list of excluded directories.
  -np, --no-parent                  don't ascend to the parent directory.

Mail bug reports and suggestions to <bug-wget@gnu.org>.
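
Putting the two replies together, a complete update batch could look roughly like this (URL, credentials and all paths are placeholders; wget.exe and WinRAR.exe are assumed to live at the paths shown):

Code:
rem update.bat -- sketch only, URL/credentials/paths are assumptions
rem 1) Fetch the signature archive via FTP with wget (-O sets the local file name).
"C:\Programme\GnuWin32\bin\wget.exe" --passive-ftp -O "C:\test\update.zip" "ftp://user:password@ftp.example.com/updates/update.zip"

rem 2) Unpack it into the target directory with WinRAR (-y = answer Yes to all prompts).
"C:\Programme\WinRAR\WinRAR.exe" x -y "C:\test\update.zip" *.* "C:\test\test1\test2\"

Such a batch can then be run via the Windows Task Scheduler so the update happens automatically.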
 
