Merge branch 'hovudstraum' into idp

ed 2024-03-12 17:31:27 +00:00
commit f193f398c1
43 changed files with 523 additions and 160 deletions


@@ -104,7 +104,7 @@ turn almost any device into a file server with resumable uploads/downloads using
* [sfx](#sfx) - the self-contained "binary"
* [copyparty.exe](#copypartyexe) - download [copyparty.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty.exe) (win8+) or [copyparty32.exe](https://github.com/9001/copyparty/releases/latest/download/copyparty32.exe) (win7+)
* [install on android](#install-on-android)
-* [reporting bugs](#reporting-bugs) - ideas for context to include in bug reports
+* [reporting bugs](#reporting-bugs) - ideas for context to include, and where to submit them
* [devnotes](#devnotes) - for build instructions etc, see [./docs/devnotes.md](./docs/devnotes.md)
@@ -286,6 +286,9 @@ roughly sorted by chance of encounter
* cannot index non-ascii filenames with `-e2d`
* cannot handle filenames with mojibake
+if you have a new exciting bug to share, see [reporting bugs](#reporting-bugs)
## not my bugs
same order here too
@@ -341,9 +344,18 @@ upgrade notes
* yes, using the [`g` permission](#accounts-and-volumes), see the examples there
* you can also do this with linux filesystem permissions; `chmod 111 music` will make it possible to access files and folders inside the `music` folder but not list the immediate contents -- also works with other software, not just copyparty
+* can I link someone to a password-protected volume/file by including the password in the URL?
+* yes, by adding `?pw=hunter2` to the end; replace `?` with `&` if there are parameters in the URL already, meaning it contains a `?` near the end
+* how do I stop `.hist` folders from appearing everywhere on my HDD?
+* by default, a `.hist` folder is created inside each volume for the filesystem index, thumbnails, audio transcodes, and markdown document history. Use the `--hist` global-option or the `hist` volflag to move it somewhere else; see [database location](#database-location)
* can I make copyparty download a file to my server if I give it a URL?
* yes, using [hooks](https://github.com/9001/copyparty/blob/hovudstraum/bin/hooks/wget.py)
+* firefox refuses to connect over https, saying "Secure Connection Failed" or "SEC_ERROR_BAD_SIGNATURE", but the usual button to "Accept the Risk and Continue" is not shown
+* firefox has corrupted its certstore; fix this by exiting firefox, then find and delete the file named `cert9.db` somewhere in your firefox profile folder
* i want to learn python and/or programming and am considering looking at the copyparty source code in that occasion
* ```bash
_| _ __ _ _|_
@@ -1292,6 +1304,8 @@ the classname of the HTML tag is set according to the selected theme, which is u
see the top of [./copyparty/web/browser.css](./copyparty/web/browser.css) where the color variables are set, and there's layout-specific stuff near the bottom
+if you want to change the fonts, see [./docs/rice/](./docs/rice/)
## complete examples
@@ -1943,7 +1957,12 @@ if you want thumbnails (photos+videos) and you're okay with spending another 132
# reporting bugs
-ideas for context to include in bug reports
+ideas for context to include, and where to submit them
+please get in touch using any of the following URLs:
+* https://github.com/9001/copyparty/ **(primary)**
+* https://gitlab.com/9001/copyparty/ *(mirror)*
+* https://codeberg.org/9001/copyparty *(mirror)*
in general, commandline arguments (and config file if any)
@@ -1958,3 +1977,6 @@ if there's a wall of base64 in the log (thread stacks) then please include that,
# devnotes
for build instructions etc, see [./docs/devnotes.md](./docs/devnotes.md)
+see [./docs/TODO.md](./docs/TODO.md) for planned features / fixes / changes


@@ -16,6 +16,8 @@
* sharex config file to upload screenshots and grab the URL
* `RequestURL`: full URL to the target folder
* `pw`: password (remove the `pw` line if anon-write)
+* the `act:bput` thing is optional since copyparty v1.9.29
+* using an older sharex version, maybe sharex v12.1.1 for example? dw fam i got your back 👉😎👉 [`sharex12.sxcu`](sharex12.sxcu)
### [`send-to-cpp.contextlet.json`](send-to-cpp.contextlet.json)
* browser integration, kind of? custom rightclick actions and stuff

contrib/sharex12.sxcu (new file, 13 lines)

@@ -0,0 +1,13 @@
{
"Name": "copyparty",
"DestinationType": "ImageUploader, TextUploader, FileUploader",
"RequestURL": "http://127.0.0.1:3923/sharex",
"FileFormName": "f",
"Arguments": {
"act": "bput"
},
"Headers": {
"accept": "url",
"pw": "PUT_YOUR_PASSWORD_HERE_MY_DUDE"
}
}
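For reference, this is roughly the request that ShareX profile produces; a minimal Python sketch, assuming the `requests` package, with the URL and password taken as placeholders from the config above (not part of copyparty itself):

```python
# sketch of the upload described by sharex12.sxcu (assumptions: requests is
# installed, the server runs on 127.0.0.1:3923, and "hunter2" is your password)
import requests

url = "http://127.0.0.1:3923/sharex"          # RequestURL
headers = {"accept": "url", "pw": "hunter2"}  # Headers (drop pw if anon-write)
data = {"act": "bput"}                        # Arguments
files = {"f": open("screenshot.png", "rb")}   # FileFormName

r = requests.post(url, headers=headers, data=data, files=files)
print(r.text)  # with "accept: url" the reply should be the file's URL
```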


@@ -395,7 +395,7 @@ def configure_ssl_ciphers(al: argparse.Namespace) -> None:
def args_from_cfg(cfg_path: str) -> list[str]:
lines: list[str] = []
-expand_config_file(lines, cfg_path, "")
+expand_config_file(None, lines, cfg_path, "")
lines = upgrade_cfg_fmt(None, argparse.Namespace(vc=False), lines, "")
ret: list[str] = []
@@ -876,6 +876,7 @@ def add_upload(ap):
ap2.add_argument("--dotpart", action="store_true", help="dotfile incomplete uploads, hiding them from clients unless \033[33m-ed\033[0m")
ap2.add_argument("--plain-ip", action="store_true", help="when avoiding filename collisions by appending the uploader's ip to the filename: append the plaintext ip instead of salting and hashing the ip")
ap2.add_argument("--unpost", metavar="SEC", type=int, default=3600*12, help="grace period where uploads can be deleted by the uploader, even without delete permissions; 0=disabled, default=12h")
+ap2.add_argument("--u2abort", metavar="NUM", type=int, default=1, help="clients can abort incomplete uploads by using the unpost tab (requires \033[33m-e2d\033[0m). [\033[32m0\033[0m] = never allowed (disable feature), [\033[32m1\033[0m] = allow if client has the same IP as the upload AND is using the same account, [\033[32m2\033[0m] = just check the IP, [\033[32m3\033[0m] = just check account-name (volflag=u2abort)")
ap2.add_argument("--blank-wt", metavar="SEC", type=int, default=300, help="file write grace period (any client can write to a blank file last-modified more recently than \033[33mSEC\033[0m seconds ago)")
ap2.add_argument("--reg-cap", metavar="N", type=int, default=38400, help="max number of uploads to keep in memory when running without \033[33m-e2d\033[0m; roughly 1 MiB RAM per 600")
ap2.add_argument("--no-fpool", action="store_true", help="disable file-handle pooling -- instead, repeatedly close and reopen files during upload (bad idea to enable this on windows and/or cow filesystems)")
@@ -1274,6 +1275,7 @@ def add_ui(ap, retry):
ap2.add_argument("--bname", metavar="TXT", type=u, default="--name", help="server name (displayed in filebrowser document title)")
ap2.add_argument("--pb-url", metavar="URL", type=u, default="https://github.com/9001/copyparty", help="powered-by link; disable with \033[33m-np\033[0m")
ap2.add_argument("--ver", action="store_true", help="show version on the control panel (incompatible with \033[33m-nb\033[0m)")
+ap2.add_argument("--k304", metavar="NUM", type=int, default=0, help="configure the option to enable/disable k304 on the controlpanel (workaround for buggy reverse-proxies); [\033[32m0\033[0m] = hidden and default-off, [\033[32m1\033[0m] = visible and default-off, [\033[32m2\033[0m] = visible and default-on")
ap2.add_argument("--md-sbf", metavar="FLAGS", type=u, default="downloads forms popups scripts top-navigation-by-user-activation", help="list of capabilities to ALLOW for README.md docs (volflag=md_sbf); see https://developer.mozilla.org/en-US/docs/Web/HTML/Element/iframe#attr-sandbox")
ap2.add_argument("--lg-sbf", metavar="FLAGS", type=u, default="downloads forms popups scripts top-navigation-by-user-activation", help="list of capabilities to ALLOW for prologue/epilogue docs (volflag=lg_sbf)")
ap2.add_argument("--no-sb-md", action="store_true", help="don't sandbox README.md documents (volflags: no_sb_md | sb_md)")


@@ -991,7 +991,7 @@ class AuthSrv(object):
) -> None:
self.line_ctr = 0
-expand_config_file(cfg_lines, fp, "")
+expand_config_file(self.log, cfg_lines, fp, "")
if self.args.vc:
lns = ["{:4}: {}".format(n, s) for n, s in enumerate(cfg_lines, 1)]
self.log("expanded config file (unprocessed):\n" + "\n".join(lns))
@@ -1745,7 +1745,7 @@ class AuthSrv(object):
if k not in vol.flags:
vol.flags[k] = getattr(self.args, k)
-for k in ("nrand",):
+for k in ("nrand", "u2abort"):
if k in vol.flags:
vol.flags[k] = int(vol.flags[k])
@@ -2367,27 +2367,50 @@ def split_cfg_ln(ln: str) -> dict[str, Any]:
return ret
-def expand_config_file(ret: list[str], fp: str, ipath: str) -> None:
+def expand_config_file(
+log: Optional["NamedLogger"], ret: list[str], fp: str, ipath: str
+) -> None:
"""expand all % file includes"""
fp = absreal(fp)
if len(ipath.split(" -> ")) > 64:
raise Exception("hit max depth of 64 includes")
if os.path.isdir(fp):
-names = os.listdir(fp)
-crumb = "#\033[36m cfg files in {} => {}\033[0m".format(fp, names)
-ret.append(crumb)
-for fn in sorted(names):
+names = list(sorted(os.listdir(fp)))
+cnames = [x for x in names if x.lower().endswith(".conf")]
+if not cnames:
+t = "warning: tried to read config-files from folder '%s' but it does not contain any "
+if names:
+t += ".conf files; the following files were ignored: %s"
+t = t % (fp, ", ".join(names[:8]))
+else:
+t += "files at all"
+t = t % (fp,)
+if log:
+log(t, 3)
+ret.append("#\033[33m %s\033[0m" % (t,))
+else:
+zs = "#\033[36m cfg files in %s => %s\033[0m" % (fp, cnames)
+ret.append(zs)
+for fn in cnames:
fp2 = os.path.join(fp, fn)
-if not fp2.endswith(".conf") or fp2 in ipath:
+if fp2 in ipath:
continue
-expand_config_file(ret, fp2, ipath)
-if ret[-1] == crumb:
-# no config files below; remove breadcrumb
-ret.pop()
+expand_config_file(log, ret, fp2, ipath)
+return
+if not os.path.exists(fp):
+t = "warning: tried to read config from '%s' but the file/folder does not exist"
+t = t % (fp,)
+if log:
+log(t, 3)
+ret.append("#\033[31m %s\033[0m" % (t,))
return
ipath += " -> " + fp
@@ -2401,7 +2424,7 @@ def expand_config_file(ret: list[str], fp: str, ipath: str) -> None:
fp2 = ln[1:].strip()
fp2 = os.path.join(os.path.dirname(fp), fp2)
ofs = len(ret)
-expand_config_file(ret, fp2, ipath)
+expand_config_file(log, ret, fp2, ipath)
for n in range(ofs, len(ret)):
ret[n] = pad + ret[n]
continue
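In short, when a config path is a folder, the new `expand_config_file` only picks up `*.conf` files (sorted), warns if the folder has no such files, and warns if the path does not exist at all. A condensed sketch of just that selection step, simplified from the hunk above (the helper name is made up, not a function in authsrv.py):

```python
import os

def pick_conf_files(fp: str) -> list[str]:
    """simplified version of the folder branch in expand_config_file()"""
    names = sorted(os.listdir(fp))
    cnames = [n for n in names if n.lower().endswith(".conf")]
    if not cnames:
        # the real code logs a warning and keeps going; raising is just for the sketch
        raise RuntimeError("no .conf files in %r (ignored: %s)" % (fp, names[:8]))
    return [os.path.join(fp, n) for n in cnames]
```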


@@ -66,6 +66,7 @@ def vf_vmap() -> dict[str, str]:
"rm_retry",
"sort",
"unlist",
+"u2abort",
"u2ts",
):
ret[k] = k
@@ -116,6 +117,7 @@ flagcats = {
"hardlink": "does dedup with hardlinks instead of symlinks",
"neversymlink": "disables symlink fallback; full copy instead",
"copydupes": "disables dedup, always saves full copies of dupes",
+"sparse": "force use of sparse files, mainly for s3-backed storage",
"daw": "enable full WebDAV write support (dangerous);\nPUT-operations will now \033[1;31mOVERWRITE\033[0;35m existing files",
"nosub": "forces all uploads into the top folder of the vfs",
"magic": "enables filetype detection for nameless uploads",
@@ -130,6 +132,7 @@ flagcats = {
"rand": "force randomized filenames, 9 chars long by default",
"nrand=N": "randomized filenames are N chars long",
"u2ts=fc": "[f]orce [c]lient-last-modified or [u]pload-time",
+"u2abort=1": "allow aborting unfinished uploads? 0=no 1=strict 2=ip-chk 3=acct-chk",
"sz=1k-3m": "allow filesizes between 1 KiB and 3MiB",
"df=1g": "ensure 1 GiB free disk space",
},


@@ -300,7 +300,7 @@ class FtpFs(AbstractedFS):
vp = join(self.cwd, path).lstrip("/")
try:
-self.hub.up2k.handle_rm(self.uname, self.h.cli_ip, [vp], [], False)
+self.hub.up2k.handle_rm(self.uname, self.h.cli_ip, [vp], [], False, False)
except Exception as ex:
raise FSE(str(ex))


@ -319,7 +319,9 @@ class HttpCli(object):
if self.args.xff_re and not self.args.xff_re.match(pip): if self.args.xff_re and not self.args.xff_re.match(pip):
t = 'got header "%s" from untrusted source "%s" claiming the true client ip is "%s" (raw value: "%s"); if you trust this, you must allowlist this proxy with "--xff-src=%s"' t = 'got header "%s" from untrusted source "%s" claiming the true client ip is "%s" (raw value: "%s"); if you trust this, you must allowlist this proxy with "--xff-src=%s"'
if self.headers.get("cf-connecting-ip"): if self.headers.get("cf-connecting-ip"):
t += " Alternatively, if you are behind cloudflare, it is better to specify these two instead: --xff-hdr=cf-connecting-ip --xff-src=any" t += ' Note: if you are behind cloudflare, then this default header is not a good choice; please first make sure your local reverse-proxy (if any) does not allow non-cloudflare IPs from providing cf-* headers, and then add this additional global setting: "--xff-hdr=cf-connecting-ip"'
else:
t += ' Note: depending on your reverse-proxy, and/or WAF, and/or other intermediates, you may want to read the true client IP from another header by also specifying "--xff-hdr=SomeOtherHeader"'
zs = ( zs = (
".".join(pip.split(".")[:2]) + "." ".".join(pip.split(".")[:2]) + "."
if "." in pip if "." in pip
@ -529,9 +531,13 @@ class HttpCli(object):
return self.handle_options() and self.keepalive return self.handle_options() and self.keepalive
if not cors_k: if not cors_k:
host = self.headers.get("host", "<?>")
origin = self.headers.get("origin", "<?>") origin = self.headers.get("origin", "<?>")
self.log("cors-reject {} from {}".format(self.mode, origin), 3) proto = "https://" if self.is_https else "http://"
raise Pebkac(403, "no surfing") guess = "modifying" if (origin and host) else "stripping"
t = "cors-reject %s because request-header Origin='%s' does not match request-protocol '%s' and host '%s' based on request-header Host='%s' (note: if this request is not malicious, check if your reverse-proxy is accidentally %s request headers, in particular 'Origin', for example by running copyparty with --ihead='*' to show all request headers)"
self.log(t % (self.mode, origin, proto, self.host, host, guess), 3)
raise Pebkac(403, "rejected by cors-check")
# getattr(self.mode) is not yet faster than this # getattr(self.mode) is not yet faster than this
if self.mode == "POST": if self.mode == "POST":
@ -662,7 +668,11 @@ class HttpCli(object):
def k304(self) -> bool: def k304(self) -> bool:
k304 = self.cookies.get("k304") k304 = self.cookies.get("k304")
return k304 == "y" or ("; Trident/" in self.ua and not k304) return (
k304 == "y"
or (self.args.k304 == 2 and k304 != "n")
or ("; Trident/" in self.ua and not k304)
)
def send_headers( def send_headers(
self, self,
@ -2838,11 +2848,11 @@ class HttpCli(object):
logtail = "" logtail = ""
# #
# if request is for foo.js, check if we have foo.js.{gz,br} # if request is for foo.js, check if we have foo.js.gz
file_ts = 0.0 file_ts = 0.0
editions: dict[str, tuple[str, int]] = {} editions: dict[str, tuple[str, int]] = {}
for ext in ["", ".gz", ".br"]: for ext in ("", ".gz"):
try: try:
fs_path = req_path + ext fs_path = req_path + ext
st = bos.stat(fs_path) st = bos.stat(fs_path)
@ -2887,12 +2897,7 @@ class HttpCli(object):
x.strip() x.strip()
for x in self.headers.get("accept-encoding", "").lower().split(",") for x in self.headers.get("accept-encoding", "").lower().split(",")
] ]
if ".br" in editions and "br" in supported_editions: if ".gz" in editions:
is_compressed = True
selected_edition = ".br"
fs_path, file_sz = editions[".br"]
self.out_headers["Content-Encoding"] = "br"
elif ".gz" in editions:
is_compressed = True is_compressed = True
selected_edition = ".gz" selected_edition = ".gz"
fs_path, file_sz = editions[".gz"] fs_path, file_sz = editions[".gz"]
@ -2908,13 +2913,8 @@ class HttpCli(object):
is_compressed = False is_compressed = False
selected_edition = "plain" selected_edition = "plain"
try: fs_path, file_sz = editions[selected_edition]
fs_path, file_sz = editions[selected_edition] logmsg += "{} ".format(selected_edition.lstrip("."))
logmsg += "{} ".format(selected_edition.lstrip("."))
except:
# client is old and we only have .br
# (could make brotli a dep to fix this but it's not worth)
raise Pebkac(404)
# #
# partial # partial
@ -3369,6 +3369,7 @@ class HttpCli(object):
dbwt=vs["dbwt"], dbwt=vs["dbwt"],
url_suf=suf, url_suf=suf,
k304=self.k304(), k304=self.k304(),
k304vis=self.args.k304 > 0,
ver=S_VERSION if self.args.ver else "", ver=S_VERSION if self.args.ver else "",
ahttps="" if self.is_https else "https://" + self.host + self.req, ahttps="" if self.is_https else "https://" + self.host + self.req,
) )
@ -3377,7 +3378,7 @@ class HttpCli(object):
def set_k304(self) -> bool: def set_k304(self) -> bool:
v = self.uparam["k304"].lower() v = self.uparam["k304"].lower()
if v == "y": if v in "yn":
dur = 86400 * 299 dur = 86400 * 299
else: else:
dur = 0 dur = 0
@ -3560,8 +3561,7 @@ class HttpCli(object):
return ret return ret
def tx_ups(self) -> bool: def tx_ups(self) -> bool:
if not self.args.unpost: have_unpost = self.args.unpost and "e2d" in self.vn.flags
raise Pebkac(403, "the unpost feature is disabled in server config")
idx = self.conn.get_u2idx() idx = self.conn.get_u2idx()
if not idx or not hasattr(idx, "p_end"): if not idx or not hasattr(idx, "p_end"):
@ -3580,7 +3580,14 @@ class HttpCli(object):
if "fk" in vol.flags if "fk" in vol.flags
and (self.uname in vol.axs.uread or self.uname in vol.axs.upget) and (self.uname in vol.axs.uread or self.uname in vol.axs.upget)
} }
for vol in self.asrv.vfs.all_vols.values():
x = self.conn.hsrv.broker.ask(
"up2k.get_unfinished_by_user", self.uname, self.ip
)
uret = x.get()
allvols = self.asrv.vfs.all_vols if have_unpost else {}
for vol in allvols.values():
cur = idx.get_cur(vol.realpath) cur = idx.get_cur(vol.realpath)
if not cur: if not cur:
continue continue
@ -3632,9 +3639,13 @@ class HttpCli(object):
for v in ret: for v in ret:
v["vp"] = self.args.SR + v["vp"] v["vp"] = self.args.SR + v["vp"]
jtxt = json.dumps(ret, indent=2, sort_keys=True).encode("utf-8", "replace") if not have_unpost:
self.log("{} #{} {:.2f}sec".format(lm, len(ret), time.time() - t0)) ret = [{"kinshi": 1}]
self.reply(jtxt, mime="application/json")
jtxt = '{"u":%s,"c":%s}' % (uret, json.dumps(ret, indent=0))
zi = len(uret.split('\n"pd":')) - 1
self.log("%s #%d+%d %.2fsec" % (lm, zi, len(ret), time.time() - t0))
self.reply(jtxt.encode("utf-8", "replace"), mime="application/json")
return True return True
def handle_rm(self, req: list[str]) -> bool: def handle_rm(self, req: list[str]) -> bool:
@ -3649,11 +3660,12 @@ class HttpCli(object):
elif self.is_vproxied: elif self.is_vproxied:
req = [x[len(self.args.SR) :] for x in req] req = [x[len(self.args.SR) :] for x in req]
unpost = "unpost" in self.uparam
nlim = int(self.uparam.get("lim") or 0) nlim = int(self.uparam.get("lim") or 0)
lim = [nlim, nlim] if nlim else [] lim = [nlim, nlim] if nlim else []
x = self.conn.hsrv.broker.ask( x = self.conn.hsrv.broker.ask(
"up2k.handle_rm", self.uname, self.ip, req, lim, False "up2k.handle_rm", self.uname, self.ip, req, lim, False, unpost
) )
self.loud_reply(x.get()) self.loud_reply(x.get())
return True return True
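Judging from the `jtxt` line above, the unpost listing is no longer a bare JSON array; it now wraps two lists, `u` (unfinished uploads, from `up2k.get_unfinished_by_user`) and `c` (recently completed uploads, or `[{"kinshi": 1}]` when unpost is disabled). A sketch of how a client might read it, with field names taken from this diff (the parsing helper itself is hypothetical):

```python
import json

def parse_unpost_reply(body: bytes):
    """split the new unpost reply into unfinished ("u") and completed ("c") uploads"""
    obj = json.loads(body)
    unfinished = obj["u"]  # entries look like {"at": ..., "vp": "/...", "pd": 42, "sz": ...}
    completed = obj["c"]   # same shape as the old bare list; [{"kinshi": 1}] if disabled
    return unfinished, completed
```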


@@ -191,7 +191,7 @@ class HttpSrv(object):
for fn in df:
ap = absreal(os.path.join(dp, fn))
self.statics.add(ap)
-if ap.endswith(".gz") or ap.endswith(".br"):
+if ap.endswith(".gz"):
self.statics.add(ap[:-3])
def set_netdevs(self, netdevs: dict[str, Netdev]) -> None:


@@ -206,6 +206,9 @@ class Metrics(object):
try:
x = self.hsrv.broker.ask("up2k.get_unfinished")
xs = x.get()
+if not xs:
+raise Exception("up2k mutex acquisition timed out")
xj = json.loads(xs)
for ptop, (nbytes, nfiles) in xj.items():
tnbytes += nbytes


@@ -340,7 +340,7 @@ class SMB(object):
yeet("blocked delete (no-del-acc): " + vpath)
vpath = vpath.replace("\\", "/").lstrip("/")
-self.hub.up2k.handle_rm(uname, "1.7.6.2", [vpath], [], False)
+self.hub.up2k.handle_rm(uname, "1.7.6.2", [vpath], [], False, False)
def _utime(self, vpath: str, times: tuple[float, float]) -> None:
if not self.args.smbw:


@@ -28,7 +28,7 @@ if True:  # pylint: disable=using-constant-test
import typing
from typing import Any, Optional, Union
-from .__init__ import ANYWIN, EXE, MACOS, TYPE_CHECKING, EnvParams, unicode
+from .__init__ import ANYWIN, E, EXE, MACOS, TYPE_CHECKING, EnvParams, unicode
from .authsrv import BAD_CFG, AuthSrv
from .cert import ensure_cert
from .mtag import HAVE_FFMPEG, HAVE_FFPROBE
@@ -94,7 +94,7 @@ class SvcHub(object):
self.stopping = False
self.stopped = False
self.reload_req = False
-self.reloading = False
+self.reloading = 0
self.stop_cond = threading.Condition()
self.nsigs = 3
self.retcode = 0
@@ -154,6 +154,8 @@ class SvcHub(object):
lg.handlers = [lh]
lg.setLevel(logging.DEBUG)
+self._check_env()
if args.stackmon:
start_stackmon(args.stackmon, 0)
@@ -385,6 +387,17 @@ class SvcHub(object):
Daemon(self.sd_notify, "sd-notify")
+def _check_env(self) -> None:
+try:
+files = os.listdir(E.cfg)
+except:
+files = []
+hits = [x for x in files if x.lower().endswith(".conf")]
+if hits:
+t = "WARNING: found config files in [%s]: %s\n  config files are not expected here, and will NOT be loaded (unless your setup is intentionally hella funky)"
+self.log("root", t % (E.cfg, ", ".join(hits)), 3)
def _process_config(self) -> bool:
al = self.args
@ -674,23 +687,24 @@ class SvcHub(object):
self.log("root", "ssdp startup failed;\n" + min_ex(), 3) self.log("root", "ssdp startup failed;\n" + min_ex(), 3)
def reload(self) -> str: def reload(self) -> str:
if self.reloading: with self.up2k.mutex:
return "cannot reload; already in progress" if self.reloading:
return "cannot reload; already in progress"
self.reloading = 1
self.reloading = True
Daemon(self._reload, "reloading") Daemon(self._reload, "reloading")
return "reload initiated" return "reload initiated"
def _reload(self, rescan_all_vols: bool = True) -> None: def _reload(self, rescan_all_vols: bool = True) -> None:
self.reloading = True
self.log("root", "reload scheduled")
with self.up2k.mutex: with self.up2k.mutex:
self.reloading = True if self.reloading != 1:
return
self.reloading = 2
self.log("root", "reloading config")
self.asrv.reload() self.asrv.reload()
self.up2k.reload(rescan_all_vols) self.up2k.reload(rescan_all_vols)
self.broker.reload() self.broker.reload()
self.reloading = 0
self.reloading = False
def stop_thr(self) -> None: def stop_thr(self) -> None:
while not self.stop_req: while not self.stop_req:
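If I read the reload hunk correctly, `self.reloading` changes from a plain boolean into a small state value (0 = idle, 1 = scheduled, 2 = running), and both the check and the transition now happen under `up2k.mutex`, so two concurrent reload requests cannot both get past the guard. A generic sketch of that pattern with simplified names (not the actual SvcHub code):

```python
import threading

class ReloadGuard:
    """double-checked reload guard: 0 = idle, 1 = scheduled, 2 = running"""

    def __init__(self) -> None:
        self.mutex = threading.Lock()
        self.state = 0

    def request(self) -> str:
        with self.mutex:
            if self.state:
                return "cannot reload; already in progress"
            self.state = 1
        threading.Thread(target=self._work, daemon=True).start()
        return "reload initiated"

    def _work(self) -> None:
        with self.mutex:
            if self.state != 1:
                return  # someone else already picked it up
            self.state = 2
            # ... perform the actual reload here ...
            self.state = 0
```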


@@ -360,7 +360,7 @@ class Tftpd(object):
yeet("attempted delete of non-empty file")
vpath = vpath.replace("\\", "/").lstrip("/")
-self.hub.up2k.handle_rm("*", "8.3.8.7", [vpath], [], False)
+self.hub.up2k.handle_rm("*", "8.3.8.7", [vpath], [], False, False)
def _access(self, *a: Any) -> bool:
return True


@ -200,15 +200,15 @@ class Up2k(object):
Daemon(self.deferred_init, "up2k-deferred-init") Daemon(self.deferred_init, "up2k-deferred-init")
def reload(self, rescan_all_vols: bool) -> None: def reload(self, rescan_all_vols: bool) -> None:
self.gid += 1 """mutex me"""
self.log("reload #{} initiated".format(self.gid)) self.log("reload #{} scheduled".format(self.gid + 1))
all_vols = self.asrv.vfs.all_vols all_vols = self.asrv.vfs.all_vols
scan_vols = [k for k, v in all_vols.items() if v.realpath not in self.registry] scan_vols = [k for k, v in all_vols.items() if v.realpath not in self.registry]
if rescan_all_vols: if rescan_all_vols:
scan_vols = list(all_vols.keys()) scan_vols = list(all_vols.keys())
self.rescan(all_vols, scan_vols, True, False) self._rescan(all_vols, scan_vols, True, False)
def deferred_init(self) -> None: def deferred_init(self) -> None:
all_vols = self.asrv.vfs.all_vols all_vols = self.asrv.vfs.all_vols
@ -237,7 +237,7 @@ class Up2k(object):
for n in range(max(1, self.args.mtag_mt)): for n in range(max(1, self.args.mtag_mt)):
Daemon(self._tagger, "tagger-{}".format(n)) Daemon(self._tagger, "tagger-{}".format(n))
Daemon(self._run_all_mtp, "up2k-mtp-init") Daemon(self._run_all_mtp, "up2k-mtp-init", (self.gid,))
def log(self, msg: str, c: Union[int, str] = 0) -> None: def log(self, msg: str, c: Union[int, str] = 0) -> None:
if self.pp: if self.pp:
@ -287,9 +287,48 @@ class Up2k(object):
} }
return json.dumps(ret, indent=4) return json.dumps(ret, indent=4)
def get_unfinished_by_user(self, uname, ip) -> str:
if PY2 or not self.mutex.acquire(timeout=2):
return '[{"timeout":1}]'
ret: list[tuple[int, str, int, int, int]] = []
try:
for ptop, tab2 in self.registry.items():
cfg = self.flags.get(ptop, {}).get("u2abort", 1)
if not cfg:
continue
addr = (ip or "\n") if cfg in (1, 2) else ""
user = (uname or "\n") if cfg in (1, 3) else ""
drp = self.droppable.get(ptop, {})
for wark, job in tab2.items():
if (
wark in drp
or (user and user != job["user"])
or (addr and addr != job["addr"])
):
continue
zt5 = (
int(job["t0"]),
djoin(job["vtop"], job["prel"], job["name"]),
job["size"],
len(job["need"]),
len(job["hash"]),
)
ret.append(zt5)
finally:
self.mutex.release()
ret.sort(reverse=True)
ret2 = [
{"at": at, "vp": "/" + vp, "pd": 100 - ((nn * 100) // (nh or 1)), "sz": sz}
for (at, vp, sz, nn, nh) in ret
]
return json.dumps(ret2, indent=0)
def get_unfinished(self) -> str: def get_unfinished(self) -> str:
if PY2 or not self.mutex.acquire(timeout=0.5): if PY2 or not self.mutex.acquire(timeout=0.5):
return "{}" return ""
ret: dict[str, tuple[int, int]] = {} ret: dict[str, tuple[int, int]] = {}
try: try:
@ -342,14 +381,21 @@ class Up2k(object):
def rescan( def rescan(
self, all_vols: dict[str, VFS], scan_vols: list[str], wait: bool, fscan: bool self, all_vols: dict[str, VFS], scan_vols: list[str], wait: bool, fscan: bool
) -> str: ) -> str:
with self.mutex:
return self._rescan(all_vols, scan_vols, wait, fscan)
def _rescan(
self, all_vols: dict[str, VFS], scan_vols: list[str], wait: bool, fscan: bool
) -> str:
"""mutex me"""
if not wait and self.pp: if not wait and self.pp:
return "cannot initiate; scan is already in progress" return "cannot initiate; scan is already in progress"
args = (all_vols, scan_vols, fscan) self.gid += 1
Daemon( Daemon(
self.init_indexes, self.init_indexes,
"up2k-rescan-{}".format(scan_vols[0] if scan_vols else "all"), "up2k-rescan-{}".format(scan_vols[0] if scan_vols else "all"),
args, (all_vols, scan_vols, fscan, self.gid),
) )
return "" return ""
@ -461,7 +507,7 @@ class Up2k(object):
if vp: if vp:
fvp = "%s/%s" % (vp, fvp) fvp = "%s/%s" % (vp, fvp)
self._handle_rm(LEELOO_DALLAS, "", fvp, [], True) self._handle_rm(LEELOO_DALLAS, "", fvp, [], True, False)
nrm += 1 nrm += 1
if nrm: if nrm:
@ -580,19 +626,32 @@ class Up2k(object):
return True, ret return True, ret
def init_indexes( def init_indexes(
self, all_vols: dict[str, VFS], scan_vols: list[str], fscan: bool self, all_vols: dict[str, VFS], scan_vols: list[str], fscan: bool, gid: int = 0
) -> bool: ) -> bool:
gid = self.gid if not gid:
while self.pp and gid == self.gid: with self.mutex:
time.sleep(0.1) gid = self.gid
if gid != self.gid: nspin = 0
return False while True:
nspin += 1
if nspin > 1:
time.sleep(0.1)
with self.mutex:
if gid != self.gid:
return False
if self.pp:
continue
self.pp = ProgressPrinter(self.log, self.args)
break
if gid: if gid:
self.log("reload #{} running".format(self.gid)) self.log("reload #%d running" % (gid,))
self.pp = ProgressPrinter(self.log, self.args)
vols = list(all_vols.values()) vols = list(all_vols.values())
t0 = time.time() t0 = time.time()
have_e2d = False have_e2d = False
@ -780,7 +839,7 @@ class Up2k(object):
if self.mtag: if self.mtag:
t = "online (running mtp)" t = "online (running mtp)"
if scan_vols: if scan_vols:
thr = Daemon(self._run_all_mtp, "up2k-mtp-scan", r=False) thr = Daemon(self._run_all_mtp, "up2k-mtp-scan", (gid,), r=False)
else: else:
self.pp = None self.pp = None
t = "online, idle" t = "online, idle"
@ -1814,8 +1873,7 @@ class Up2k(object):
self.pending_tags = [] self.pending_tags = []
return ret return ret
def _run_all_mtp(self) -> None: def _run_all_mtp(self, gid: int) -> None:
gid = self.gid
t0 = time.time() t0 = time.time()
for ptop, flags in self.flags.items(): for ptop, flags in self.flags.items():
if "mtp" in flags: if "mtp" in flags:
@ -2676,6 +2734,9 @@ class Up2k(object):
a = [job[x] for x in zs.split()] a = [job[x] for x in zs.split()]
self.db_add(cur, vfs.flags, *a) self.db_add(cur, vfs.flags, *a)
cur.connection.commit() cur.connection.commit()
elif wark in reg:
# checks out, but client may have hopped IPs
job["addr"] = cj["addr"]
if not job: if not job:
ap1 = djoin(cj["ptop"], cj["prel"]) ap1 = djoin(cj["ptop"], cj["prel"])
@ -3212,7 +3273,13 @@ class Up2k(object):
pass pass
def handle_rm( def handle_rm(
self, uname: str, ip: str, vpaths: list[str], lim: list[int], rm_up: bool self,
uname: str,
ip: str,
vpaths: list[str],
lim: list[int],
rm_up: bool,
unpost: bool,
) -> str: ) -> str:
n_files = 0 n_files = 0
ok = {} ok = {}
@ -3222,7 +3289,7 @@ class Up2k(object):
self.log("hit delete limit of {} files".format(lim[1]), 3) self.log("hit delete limit of {} files".format(lim[1]), 3)
break break
a, b, c = self._handle_rm(uname, ip, vp, lim, rm_up) a, b, c = self._handle_rm(uname, ip, vp, lim, rm_up, unpost)
n_files += a n_files += a
for k in b: for k in b:
ok[k] = 1 ok[k] = 1
@ -3236,25 +3303,43 @@ class Up2k(object):
return "deleted {} files (and {}/{} folders)".format(n_files, iok, iok + ing) return "deleted {} files (and {}/{} folders)".format(n_files, iok, iok + ing)
def _handle_rm( def _handle_rm(
self, uname: str, ip: str, vpath: str, lim: list[int], rm_up: bool self, uname: str, ip: str, vpath: str, lim: list[int], rm_up: bool, unpost: bool
) -> tuple[int, list[str], list[str]]: ) -> tuple[int, list[str], list[str]]:
self.db_act = time.time() self.db_act = time.time()
try: partial = ""
if not unpost:
permsets = [[True, False, False, True]] permsets = [[True, False, False, True]]
vn, rem = self.asrv.vfs.get(vpath, uname, *permsets[0]) vn, rem = self.asrv.vfs.get(vpath, uname, *permsets[0])
vn, rem = vn.get_dbv(rem) vn, rem = vn.get_dbv(rem)
unpost = False else:
except:
# unpost with missing permissions? verify with db # unpost with missing permissions? verify with db
if not self.args.unpost:
raise Pebkac(400, "the unpost feature is disabled in server config")
unpost = True
permsets = [[False, True]] permsets = [[False, True]]
vn, rem = self.asrv.vfs.get(vpath, uname, *permsets[0]) vn, rem = self.asrv.vfs.get(vpath, uname, *permsets[0])
vn, rem = vn.get_dbv(rem) vn, rem = vn.get_dbv(rem)
ptop = vn.realpath
with self.mutex: with self.mutex:
_, _, _, _, dip, dat = self._find_from_vpath(vn.realpath, rem) abrt_cfg = self.flags.get(ptop, {}).get("u2abort", 1)
addr = (ip or "\n") if abrt_cfg in (1, 2) else ""
user = (uname or "\n") if abrt_cfg in (1, 3) else ""
reg = self.registry.get(ptop, {}) if abrt_cfg else {}
for wark, job in reg.items():
if (user and user != job["user"]) or (addr and addr != job["addr"]):
continue
if djoin(job["prel"], job["name"]) == rem:
if job["ptop"] != ptop:
t = "job.ptop [%s] != vol.ptop [%s] ??"
raise Exception(t % (job["ptop"] != ptop))
partial = vn.canonical(vjoin(job["prel"], job["tnam"]))
break
if partial:
dip = ip
dat = time.time()
else:
if not self.args.unpost:
t = "the unpost feature is disabled in server config"
raise Pebkac(400, t)
_, _, _, _, dip, dat = self._find_from_vpath(ptop, rem)
t = "you cannot delete this: " t = "you cannot delete this: "
if not dip: if not dip:
@ -3347,6 +3432,9 @@ class Up2k(object):
cur.connection.commit() cur.connection.commit()
wunlink(self.log, abspath, dbv.flags) wunlink(self.log, abspath, dbv.flags)
if partial:
wunlink(self.log, partial, dbv.flags)
partial = ""
if xad: if xad:
runhook( runhook(
self.log, self.log,
@ -3942,7 +4030,13 @@ class Up2k(object):
if not ANYWIN and sprs and sz > 1024 * 1024: if not ANYWIN and sprs and sz > 1024 * 1024:
fs = self.fstab.get(pdir) fs = self.fstab.get(pdir)
if fs != "ok": if fs == "ok":
pass
elif "sparse" in self.flags[job["ptop"]]:
t = "volflag 'sparse' is forcing use of sparse files for uploads to [%s]"
self.log(t % (job["ptop"],))
relabel = True
else:
relabel = True relabel = True
f.seek(1024 * 1024 - 1) f.seek(1024 * 1024 - 1)
f.write(b"e") f.write(b"e")
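Both `get_unfinished_by_user` and the new `_handle_rm` branch appear to apply the same `u2abort` rule: 0 disables aborting, 1 requires the same IP and the same account, 2 checks only the IP, 3 checks only the account. A small standalone restatement of that check (hypothetical helper, not a function that exists in up2k.py):

```python
def may_abort(cfg: int, job: dict, uname: str, ip: str) -> bool:
    """u2abort semantics: 0 = never, 1 = ip AND account, 2 = ip only, 3 = account only"""
    if not cfg:
        return False
    if cfg in (1, 2) and job["addr"] != ip:
        return False
    if cfg in (1, 3) and job["user"] != uname:
        return False
    return True
```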


@ -494,6 +494,7 @@ html.dz {
text-shadow: none; text-shadow: none;
font-family: 'scp', monospace, monospace; font-family: 'scp', monospace, monospace;
font-family: var(--font-mono), 'scp', monospace, monospace;
} }
html.dy { html.dy {
--fg: #000; --fg: #000;
@ -603,6 +604,7 @@ html {
color: var(--fg); color: var(--fg);
background: var(--bgg); background: var(--bgg);
font-family: sans-serif; font-family: sans-serif;
font-family: var(--font-main), sans-serif;
text-shadow: 1px 1px 0px var(--bg-max); text-shadow: 1px 1px 0px var(--bg-max);
} }
html, body { html, body {
@ -611,6 +613,7 @@ html, body {
} }
pre, code, tt, #doc, #doc>code { pre, code, tt, #doc, #doc>code {
font-family: 'scp', monospace, monospace; font-family: 'scp', monospace, monospace;
font-family: var(--font-mono), 'scp', monospace, monospace;
} }
.ayjump { .ayjump {
position: fixed; position: fixed;
@ -759,6 +762,7 @@ html #files.hhpick thead th {
} }
#files tbody td:nth-child(3) { #files tbody td:nth-child(3) {
font-family: 'scp', monospace, monospace; font-family: 'scp', monospace, monospace;
font-family: var(--font-mono), 'scp', monospace, monospace;
text-align: right; text-align: right;
padding-right: 1em; padding-right: 1em;
white-space: nowrap; white-space: nowrap;
@ -821,6 +825,7 @@ html.y #path a:hover {
.logue.raw { .logue.raw {
white-space: pre; white-space: pre;
font-family: 'scp', 'consolas', monospace; font-family: 'scp', 'consolas', monospace;
font-family: var(--font-mono), 'scp', 'consolas', monospace;
} }
#doc>iframe, #doc>iframe,
.logue>iframe { .logue>iframe {
@ -1417,6 +1422,7 @@ input[type="checkbox"]:checked+label {
} }
html.dz input { html.dz input {
font-family: 'scp', monospace, monospace; font-family: 'scp', monospace, monospace;
font-family: var(--font-mono), 'scp', monospace, monospace;
} }
.opwide div>span>input+label { .opwide div>span>input+label {
padding: .3em 0 .3em .3em; padding: .3em 0 .3em .3em;
@ -1702,6 +1708,7 @@ html.y #tree.nowrap .ntree a+a:hover {
} }
.ntree a:first-child { .ntree a:first-child {
font-family: 'scp', monospace, monospace; font-family: 'scp', monospace, monospace;
font-family: var(--font-mono), 'scp', monospace, monospace;
font-size: 1.2em; font-size: 1.2em;
line-height: 0; line-height: 0;
} }
@ -1832,6 +1839,10 @@ html.y #tree.nowrap .ntree a+a:hover {
margin: 0; margin: 0;
padding: 0; padding: 0;
} }
#unpost td:nth-child(3),
#unpost td:nth-child(4) {
text-align: right;
}
#rui { #rui {
background: #fff; background: #fff;
background: var(--bg); background: var(--bg);
@ -1859,6 +1870,7 @@ html.y #tree.nowrap .ntree a+a:hover {
} }
#rn_vadv input { #rn_vadv input {
font-family: 'scp', monospace, monospace; font-family: 'scp', monospace, monospace;
font-family: var(--font-mono), 'scp', monospace, monospace;
} }
#rui td+td, #rui td+td,
#rui td input[type="text"] { #rui td input[type="text"] {
@ -1922,6 +1934,7 @@ html.y #doc {
#doc.mdo { #doc.mdo {
white-space: normal; white-space: normal;
font-family: sans-serif; font-family: sans-serif;
font-family: var(--font-main), sans-serif;
} }
#doc.prism * { #doc.prism * {
line-height: 1.5em; line-height: 1.5em;
@ -1981,6 +1994,7 @@ a.btn,
} }
#hkhelp td:first-child { #hkhelp td:first-child {
font-family: 'scp', monospace, monospace; font-family: 'scp', monospace, monospace;
font-family: var(--font-mono), 'scp', monospace, monospace;
} }
html.noscroll, html.noscroll,
html.noscroll .sbar { html.noscroll .sbar {
@ -2490,6 +2504,7 @@ html.y #bbox-overlay figcaption a {
} }
#op_up2k.srch td.prog { #op_up2k.srch td.prog {
font-family: sans-serif; font-family: sans-serif;
font-family: var(--font-main), sans-serif;
font-size: 1em; font-size: 1em;
width: auto; width: auto;
} }
@ -2504,6 +2519,7 @@ html.y #bbox-overlay figcaption a {
white-space: nowrap; white-space: nowrap;
display: inline-block; display: inline-block;
font-family: 'scp', monospace, monospace; font-family: 'scp', monospace, monospace;
font-family: var(--font-mono), 'scp', monospace, monospace;
} }
#u2etas.o { #u2etas.o {
width: 20em; width: 20em;
@ -2573,6 +2589,7 @@ html.y #bbox-overlay figcaption a {
#u2cards span { #u2cards span {
color: var(--fg-max); color: var(--fg-max);
font-family: 'scp', monospace; font-family: 'scp', monospace;
font-family: var(--font-mono), 'scp', monospace;
} }
#u2cards > a:nth-child(4) > span { #u2cards > a:nth-child(4) > span {
display: inline-block; display: inline-block;
@ -2738,6 +2755,7 @@ html.b #u2conf a.b:hover {
} }
.prog { .prog {
font-family: 'scp', monospace, monospace; font-family: 'scp', monospace, monospace;
font-family: var(--font-mono), 'scp', monospace, monospace;
} }
#u2tab span.inf, #u2tab span.inf,
#u2tab span.ok, #u2tab span.ok,


@@ -7,9 +7,9 @@
<meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.8, minimum-scale=0.6">
<meta name="theme-color" content="#333">
-{{ html_head }}
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/ui.css?_={{ ts }}">
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/browser.css?_={{ ts }}">
+{{ html_head }}
{%- if css %}
<link rel="stylesheet" media="screen" href="{{ css }}_={{ ts }}">
{%- endif %}


@ -102,7 +102,7 @@ var Ls = {
"access": " access", "access": " access",
"ot_close": "close submenu", "ot_close": "close submenu",
"ot_search": "search for files by attributes, path / name, music tags, or any combination of those$N$N&lt;code&gt;foo bar&lt;/code&gt; = must contain both «foo» and «bar»,$N&lt;code&gt;foo -bar&lt;/code&gt; = must contain «foo» but not «bar»,$N&lt;code&gt;^yana .opus$&lt;/code&gt; = start with «yana» and be an «opus» file$N&lt;code&gt;&quot;try unite&quot;&lt;/code&gt; = contain exactly «try unite»$N$Nthe date format is iso-8601, like$N&lt;code&gt;2009-12-31&lt;/code&gt; or &lt;code&gt;2020-09-12 23:30:00&lt;/code&gt;", "ot_search": "search for files by attributes, path / name, music tags, or any combination of those$N$N&lt;code&gt;foo bar&lt;/code&gt; = must contain both «foo» and «bar»,$N&lt;code&gt;foo -bar&lt;/code&gt; = must contain «foo» but not «bar»,$N&lt;code&gt;^yana .opus$&lt;/code&gt; = start with «yana» and be an «opus» file$N&lt;code&gt;&quot;try unite&quot;&lt;/code&gt; = contain exactly «try unite»$N$Nthe date format is iso-8601, like$N&lt;code&gt;2009-12-31&lt;/code&gt; or &lt;code&gt;2020-09-12 23:30:00&lt;/code&gt;",
"ot_unpost": "unpost: delete your recent uploads", "ot_unpost": "unpost: delete your recent uploads, or abort unfinished ones",
"ot_bup": "bup: basic uploader, even supports netscape 4.0", "ot_bup": "bup: basic uploader, even supports netscape 4.0",
"ot_mkdir": "mkdir: create a new directory", "ot_mkdir": "mkdir: create a new directory",
"ot_md": "new-md: create a new markdown document", "ot_md": "new-md: create a new markdown document",
@ -240,13 +240,14 @@ var Ls = {
"ml_drc": "dynamic range compressor", "ml_drc": "dynamic range compressor",
"mt_shuf": "shuffle the songs in each folder\">🔀", "mt_shuf": "shuffle the songs in each folder\">🔀",
"mt_aplay": "autoplay if there is a song-ID in the link you clicked to access the server$N$Ndisabling this will also stop the page URL from being updated with song-IDs when playing music, to prevent autoplay if these settings are lost but the URL remains\">a▶",
"mt_preload": "start loading the next song near the end for gapless playback\">preload", "mt_preload": "start loading the next song near the end for gapless playback\">preload",
"mt_prescan": "go to the next folder before the last song$Nends, keeping the webbrowser happy$Nso it doesn't stop the playback\">nav", "mt_prescan": "go to the next folder before the last song$Nends, keeping the webbrowser happy$Nso it doesn't stop the playback\">nav",
"mt_fullpre": "try to preload the entire song;$N✅ enable on <b>unreliable</b> connections,$N❌ <b>disable</b> on slow connections probably\">full", "mt_fullpre": "try to preload the entire song;$N✅ enable on <b>unreliable</b> connections,$N❌ <b>disable</b> on slow connections probably\">full",
"mt_waves": "waveform seekbar:$Nshow audio amplitude in the scrubber\">~s", "mt_waves": "waveform seekbar:$Nshow audio amplitude in the scrubber\">~s",
"mt_npclip": "show buttons for clipboarding the currently playing song\">/np", "mt_npclip": "show buttons for clipboarding the currently playing song\">/np",
"mt_octl": "os integration (media hotkeys / osd)\">os-ctl", "mt_octl": "os integration (media hotkeys / osd)\">os-ctl",
"mt_oseek": "allow seeking through os integration\">seek", "mt_oseek": "allow seeking through os integration$N$Nnote: on some devices (iPhones),$Nthis replaces the next-song button\">seek",
"mt_oscv": "show album cover in osd\">art", "mt_oscv": "show album cover in osd\">art",
"mt_follow": "keep the playing track scrolled into view\">🎯", "mt_follow": "keep the playing track scrolled into view\">🎯",
"mt_compact": "compact controls\">⟎", "mt_compact": "compact controls\">⟎",
@ -388,6 +389,8 @@ var Ls = {
"md_eshow": "cannot render ", "md_eshow": "cannot render ",
"md_off": "[📜<em>readme</em>] disabled in [⚙️] -- document hidden", "md_off": "[📜<em>readme</em>] disabled in [⚙️] -- document hidden",
"badreply": "Failed to parse reply from server",
"xhr403": "403: Access denied\n\ntry pressing F5, maybe you got logged out", "xhr403": "403: Access denied\n\ntry pressing F5, maybe you got logged out",
"cf_ok": "sorry about that -- DD" + wah + "oS protection kicked in\n\nthings should resume in about 30 sec\n\nif nothing happens, hit F5 to reload the page", "cf_ok": "sorry about that -- DD" + wah + "oS protection kicked in\n\nthings should resume in about 30 sec\n\nif nothing happens, hit F5 to reload the page",
"tl_xe1": "could not list subfolders:\n\nerror ", "tl_xe1": "could not list subfolders:\n\nerror ",
@ -409,7 +412,7 @@ var Ls = {
"fz_zipd": "zip with traditional cp437 filenames, for really old software", "fz_zipd": "zip with traditional cp437 filenames, for really old software",
"fz_zipc": "cp437 with crc32 computed early,$Nfor MS-DOS PKZIP v2.04g (october 1993)$N(takes longer to process before download can start)", "fz_zipc": "cp437 with crc32 computed early,$Nfor MS-DOS PKZIP v2.04g (october 1993)$N(takes longer to process before download can start)",
"un_m1": "you can delete your recent uploads below", "un_m1": "you can delete your recent uploads (or abort unfinished ones) below",
"un_upd": "refresh", "un_upd": "refresh",
"un_m4": "or share the files visible below:", "un_m4": "or share the files visible below:",
"un_ulist": "show", "un_ulist": "show",
@ -418,12 +421,15 @@ var Ls = {
"un_fclr": "clear filter", "un_fclr": "clear filter",
"un_derr": 'unpost-delete failed:\n', "un_derr": 'unpost-delete failed:\n',
"un_f5": 'something broke, please try a refresh or hit F5', "un_f5": 'something broke, please try a refresh or hit F5',
"un_nou": '<b>warning:</b> server too busy to show unfinished uploads; click the "refresh" link in a bit',
"un_noc": '<b>warning:</b> unpost of fully uploaded files is not enabled/permitted in server config',
"un_max": "showing first 2000 files (use the filter)", "un_max": "showing first 2000 files (use the filter)",
"un_avail": "{0} uploads can be deleted", "un_avail": "{0} recent uploads can be deleted<br />{1} unfinished ones can be aborted",
"un_m2": "sorted by upload time &ndash; most recent first:", "un_m2": "sorted by upload time; most recent first:",
"un_no1": "sike! no uploads are sufficiently recent", "un_no1": "sike! no uploads are sufficiently recent",
"un_no2": "sike! no uploads matching that filter are sufficiently recent", "un_no2": "sike! no uploads matching that filter are sufficiently recent",
"un_next": "delete the next {0} files below", "un_next": "delete the next {0} files below",
"un_abrt": "abort",
"un_del": "delete", "un_del": "delete",
"un_m3": "loading your recent uploads...", "un_m3": "loading your recent uploads...",
"un_busy": "deleting {0} files...", "un_busy": "deleting {0} files...",
@ -737,13 +743,14 @@ var Ls = {
"ml_drc": "compressor (volum-utjevning)", "ml_drc": "compressor (volum-utjevning)",
"mt_shuf": "sangene i hver mappe$Nspilles i tilfeldig rekkefølge\">🔀", "mt_shuf": "sangene i hver mappe$Nspilles i tilfeldig rekkefølge\">🔀",
"mt_aplay": "forsøk å starte avspilling hvis linken du klikket på for å åpne nettsiden inneholder en sang-ID$N$Nhvis denne deaktiveres så vil heller ikke nettside-URLen bli oppdatert med sang-ID'er når musikk spilles, i tilfelle innstillingene skulle gå tapt og nettsiden lastes på ny\">a▶",
"mt_preload": "hent ned litt av neste sang i forkant,$Nslik at pausen i overgangen blir mindre\">forles", "mt_preload": "hent ned litt av neste sang i forkant,$Nslik at pausen i overgangen blir mindre\">forles",
"mt_prescan": "ved behov, bla til neste mappe$Nslik at nettleseren lar oss$Nfortsette å spille musikk\">bla", "mt_prescan": "ved behov, bla til neste mappe$Nslik at nettleseren lar oss$Nfortsette å spille musikk\">bla",
"mt_fullpre": "hent ned hele neste sang, ikke bare litt:$N✅ skru på hvis nettet ditt er <b>ustabilt</b>,$N❌ skru av hvis nettet ditt er <b>tregt</b>\">full", "mt_fullpre": "hent ned hele neste sang, ikke bare litt:$N✅ skru på hvis nettet ditt er <b>ustabilt</b>,$N❌ skru av hvis nettet ditt er <b>tregt</b>\">full",
"mt_waves": "waveform seekbar:$Nvis volumkurve i avspillingsfeltet\">~s", "mt_waves": "waveform seekbar:$Nvis volumkurve i avspillingsfeltet\">~s",
"mt_npclip": "vis knapper for å kopiere info om sangen du hører på\">/np", "mt_npclip": "vis knapper for å kopiere info om sangen du hører på\">/np",
"mt_octl": "integrering med operativsystemet (fjernkontroll, info-skjerm)\">os-ctl", "mt_octl": "integrering med operativsystemet (fjernkontroll, info-skjerm)\">os-ctl",
"mt_oseek": "tillat spoling med fjernkontroll\">spoling", "mt_oseek": "tillat spoling med fjernkontroll$N$Nmerk: på noen enheter (iPhones) så vil$Ndette erstatte knappen for neste sang\">spoling",
"mt_oscv": "vis album-cover på infoskjermen\">bilde", "mt_oscv": "vis album-cover på infoskjermen\">bilde",
"mt_follow": "bla slik at sangen som spilles alltid er synlig\">🎯", "mt_follow": "bla slik at sangen som spilles alltid er synlig\">🎯",
"mt_compact": "tettpakket avspillerpanel\">⟎", "mt_compact": "tettpakket avspillerpanel\">⟎",
@ -885,6 +892,8 @@ var Ls = {
"md_eshow": "viser forenklet ", "md_eshow": "viser forenklet ",
"md_off": "[📜<em>readme</em>] er avskrudd i [⚙️] -- dokument skjult", "md_off": "[📜<em>readme</em>] er avskrudd i [⚙️] -- dokument skjult",
"badreply": "Ugyldig svar ifra serveren",
"xhr403": "403: Tilgang nektet\n\nkanskje du ble logget ut? prøv å trykk F5", "xhr403": "403: Tilgang nektet\n\nkanskje du ble logget ut? prøv å trykk F5",
"cf_ok": "beklager -- liten tilfeldig kontroll, alt OK\n\nting skal fortsette om ca. 30 sekunder\n\nhvis ikkeno skjer, trykk F5 for å laste siden på nytt", "cf_ok": "beklager -- liten tilfeldig kontroll, alt OK\n\nting skal fortsette om ca. 30 sekunder\n\nhvis ikkeno skjer, trykk F5 for å laste siden på nytt",
"tl_xe1": "kunne ikke hente undermapper:\n\nfeil ", "tl_xe1": "kunne ikke hente undermapper:\n\nfeil ",
@ -906,7 +915,7 @@ var Ls = {
"fz_zipd": "zip med filnavn i cp437, for høggamle maskiner", "fz_zipd": "zip med filnavn i cp437, for høggamle maskiner",
"fz_zipc": "cp437 med tidlig crc32,$Nfor MS-DOS PKZIP v2.04g (oktober 1993)$N(øker behandlingstid på server)", "fz_zipc": "cp437 med tidlig crc32,$Nfor MS-DOS PKZIP v2.04g (oktober 1993)$N(øker behandlingstid på server)",
"un_m1": "nedenfor kan du angre / slette filer som du nylig har lastet opp", "un_m1": "nedenfor kan du angre / slette filer som du nylig har lastet opp, eller avbryte ufullstendige opplastninger",
"un_upd": "oppdater", "un_upd": "oppdater",
"un_m4": "eller hvis du vil dele nedlastnings-lenkene:", "un_m4": "eller hvis du vil dele nedlastnings-lenkene:",
"un_ulist": "vis", "un_ulist": "vis",
@ -915,12 +924,15 @@ var Ls = {
"un_fclr": "nullstill filter", "un_fclr": "nullstill filter",
"un_derr": 'unpost-sletting feilet:\n', "un_derr": 'unpost-sletting feilet:\n',
"un_f5": 'noe gikk galt, prøv å oppdatere listen eller trykk F5', "un_f5": 'noe gikk galt, prøv å oppdatere listen eller trykk F5',
"un_nou": '<b>advarsel:</b> kan ikke vise ufullstendige opplastninger akkurat nå; klikk på oppdater-linken om litt',
"un_noc": '<b>advarsel:</b> angring av fullførte opplastninger er deaktivert i serverkonfigurasjonen',
"un_max": "viser de første 2000 filene (bruk filteret for å innsnevre)", "un_max": "viser de første 2000 filene (bruk filteret for å innsnevre)",
"un_avail": "{0} filer kan slettes", "un_avail": "{0} nylig opplastede filer kan slettes<br />{1} ufullstendige opplastninger kan avbrytes",
"un_m2": "sortert etter opplastningstid &ndash; nyeste først:", "un_m2": "sortert etter opplastningstid; nyeste først:",
"un_no1": "men nei, her var det jaggu ikkeno som slettes kan", "un_no1": "men nei, her var det jaggu ikkeno som slettes kan",
"un_no2": "men nei, her var det jaggu ingenting som passet overens med filteret", "un_no2": "men nei, her var det jaggu ingenting som passet overens med filteret",
"un_next": "slett de neste {0} filene nedenfor", "un_next": "slett de neste {0} filene nedenfor",
"un_abrt": "avbryt",
"un_del": "slett", "un_del": "slett",
"un_m3": "henter listen med nylig opplastede filer...", "un_m3": "henter listen med nylig opplastede filer...",
"un_busy": "sletter {0} filer...", "un_busy": "sletter {0} filer...",
@ -967,7 +979,7 @@ var Ls = {
"u_emtleakf": 'prøver følgende:\n<ul><li>trykk F5 for å laste siden på nytt</li><li>så skru på <code>🥔</code> ("enkelt UI") i opplasteren</li><li>og forsøk den samme opplastningen igjen</li></ul>\nPS: Firefox <a href="https://bugzilla.mozilla.org/show_bug.cgi?id=1790500">fikser forhåpentligvis feilen</a> en eller annen gang', "u_emtleakf": 'prøver følgende:\n<ul><li>trykk F5 for å laste siden på nytt</li><li>så skru på <code>🥔</code> ("enkelt UI") i opplasteren</li><li>og forsøk den samme opplastningen igjen</li></ul>\nPS: Firefox <a href="https://bugzilla.mozilla.org/show_bug.cgi?id=1790500">fikser forhåpentligvis feilen</a> en eller annen gang',
"u_s404": "ikke funnet på serveren", "u_s404": "ikke funnet på serveren",
"u_expl": "forklar", "u_expl": "forklar",
"u_maxconn": "de fleste nettlesere tillater ikke mer enn 6, men firefox lar deg øke grensen med <code>connections-per-server</code> in <code>about:config</code>", "u_maxconn": "de fleste nettlesere tillater ikke mer enn 6, men firefox lar deg øke grensen med <code>connections-per-server</code> i <code>about:config</code>",
"u_tu": '<p class="warn">ADVARSEL: turbo er på, <span>&nbsp;avbrutte opplastninger vil muligens ikke oppdages og gjenopptas; hold musepekeren over turbo-knappen for mer info</span></p>', "u_tu": '<p class="warn">ADVARSEL: turbo er på, <span>&nbsp;avbrutte opplastninger vil muligens ikke oppdages og gjenopptas; hold musepekeren over turbo-knappen for mer info</span></p>',
"u_ts": '<p class="warn">ADVARSEL: turbo er på, <span>&nbsp;søkeresultater kan være feil; hold musepekeren over turbo-knappen for mer info</span></p>', "u_ts": '<p class="warn">ADVARSEL: turbo er på, <span>&nbsp;søkeresultater kan være feil; hold musepekeren over turbo-knappen for mer info</span></p>',
"u_turbo_c": "turbo er deaktivert i serverkonfigurasjonen", "u_turbo_c": "turbo er deaktivert i serverkonfigurasjonen",
@ -1024,7 +1036,7 @@ modal.load();
ebi('ops').innerHTML = ( ebi('ops').innerHTML = (
'<a href="#" data-dest="" tt="' + L.ot_close + '">--</a>' + '<a href="#" data-dest="" tt="' + L.ot_close + '">--</a>' +
'<a href="#" data-perm="read" data-dep="idx" data-dest="search" tt="' + L.ot_search + '">🔎</a>' + '<a href="#" data-perm="read" data-dep="idx" data-dest="search" tt="' + L.ot_search + '">🔎</a>' +
(have_del && have_unpost ? '<a href="#" data-dest="unpost" data-dep="idx" tt="' + L.ot_unpost + '">🧯</a>' : '') + (have_del ? '<a href="#" data-dest="unpost" tt="' + L.ot_unpost + '">🧯</a>' : '') +
'<a href="#" data-dest="up2k">🚀</a>' + '<a href="#" data-dest="up2k">🚀</a>' +
'<a href="#" data-perm="write" data-dest="bup" tt="' + L.ot_bup + '">🎈</a>' + '<a href="#" data-perm="write" data-dest="bup" tt="' + L.ot_bup + '">🎈</a>' +
'<a href="#" data-perm="write" data-dest="mkdir" tt="' + L.ot_mkdir + '">📂</a>' + '<a href="#" data-perm="write" data-dest="mkdir" tt="' + L.ot_mkdir + '">📂</a>' +
@ -1401,6 +1413,7 @@ var mpl = (function () {
ebi('op_player').innerHTML = ( ebi('op_player').innerHTML = (
'<div><h3>' + L.cl_opts + '</h3><div>' + '<div><h3>' + L.cl_opts + '</h3><div>' +
'<a href="#" class="tgl btn" id="au_shuf" tt="' + L.mt_shuf + '</a>' + '<a href="#" class="tgl btn" id="au_shuf" tt="' + L.mt_shuf + '</a>' +
'<a href="#" class="tgl btn" id="au_aplay" tt="' + L.mt_aplay + '</a>' +
'<a href="#" class="tgl btn" id="au_preload" tt="' + L.mt_preload + '</a>' + '<a href="#" class="tgl btn" id="au_preload" tt="' + L.mt_preload + '</a>' +
'<a href="#" class="tgl btn" id="au_prescan" tt="' + L.mt_prescan + '</a>' + '<a href="#" class="tgl btn" id="au_prescan" tt="' + L.mt_prescan + '</a>' +
'<a href="#" class="tgl btn" id="au_fullpre" tt="' + L.mt_fullpre + '</a>' + '<a href="#" class="tgl btn" id="au_fullpre" tt="' + L.mt_fullpre + '</a>' +
@ -1446,6 +1459,7 @@ var mpl = (function () {
bcfg_bind(r, 'shuf', 'au_shuf', false, function () { bcfg_bind(r, 'shuf', 'au_shuf', false, function () {
mp.read_order(); // don't bind mp.read_order(); // don't bind
}); });
bcfg_bind(r, 'aplay', 'au_aplay', true);
bcfg_bind(r, 'preload', 'au_preload', true); bcfg_bind(r, 'preload', 'au_preload', true);
bcfg_bind(r, 'prescan', 'au_prescan', true); bcfg_bind(r, 'prescan', 'au_prescan', true);
bcfg_bind(r, 'fullpre', 'au_fullpre', false); bcfg_bind(r, 'fullpre', 'au_fullpre', false);
@ -3103,7 +3117,9 @@ function play(tid, is_ev, seek) {
try { try {
mp.nopause(); mp.nopause();
mp.au.play(); if (mpl.aplay || is_ev !== -1)
mp.au.play();
if (mp.au.paused) if (mp.au.paused)
autoplay_blocked(seek); autoplay_blocked(seek);
else if (seek) { else if (seek) {
@ -3113,7 +3129,8 @@ function play(tid, is_ev, seek) {
if (!seek && !ebi('unsearch')) { if (!seek && !ebi('unsearch')) {
var o = ebi(oid); var o = ebi(oid);
o.setAttribute('id', 'thx_js'); o.setAttribute('id', 'thx_js');
sethash(oid); if (mpl.aplay)
sethash(oid);
o.setAttribute('id', oid); o.setAttribute('id', oid);
} }
@ -3275,9 +3292,9 @@ function eval_hash() {
if (mtype == 'a') { if (mtype == 'a') {
if (!ts) if (!ts)
return play(id); return play(id, -1);
return play(id, false, ts); return play(id, -1, ts);
} }
if (mtype == 'g') { if (mtype == 'g') {
@ -5547,7 +5564,7 @@ document.onkeydown = function (e) {
function xhr_search_results() { function xhr_search_results() {
if (this.status !== 200) { if (this.status !== 200) {
var msg = unpre(this.responseText); var msg = hunpre(this.responseText);
srch_msg(true, "http " + this.status + ": " + msg); srch_msg(true, "http " + this.status + ": " + msg);
search_in_progress = 0; search_in_progress = 0;
return; return;
@ -7487,7 +7504,7 @@ var msel = (function () {
xhrchk(this, L.fd_xe1, L.fd_xe2); xhrchk(this, L.fd_xe1, L.fd_xe2);
if (this.status !== 201) { if (this.status !== 201) {
sf.textContent = 'error: ' + unpre(this.responseText); sf.textContent = 'error: ' + hunpre(this.responseText);
return; return;
} }
@ -7535,7 +7552,7 @@ var msel = (function () {
xhrchk(this, L.fsm_xe1, L.fsm_xe2); xhrchk(this, L.fsm_xe1, L.fsm_xe2);
if (this.status < 200 || this.status > 201) { if (this.status < 200 || this.status > 201) {
sf.textContent = 'error: ' + unpre(this.responseText); sf.textContent = 'error: ' + hunpre(this.responseText);
return; return;
} }
@ -7569,12 +7586,25 @@ var globalcss = (function () {
var css = ds[b].cssText.split(/\burl\(/g); var css = ds[b].cssText.split(/\burl\(/g);
ret += css[0]; ret += css[0];
for (var c = 1; c < css.length; c++) { for (var c = 1; c < css.length; c++) {
var delim = (/^["']/.exec(css[c])) ? css[c].slice(0, 1) : ''; var m = /(^ *["']?)(.*)/.exec(css[c]),
ret += 'url(' + delim + ((css[c].slice(0, 8).indexOf('://') + 1 || css[c].startsWith('/')) ? '' : base) + delim = m[1],
css[c].slice(delim ? 1 : 0); ctxt = m[2],
is_abs = /^\/|[^)/:]+:\/\//.exec(ctxt);
ret += 'url(' + delim + (is_abs ? '' : base) + ctxt;
} }
ret += '\n'; ret += '\n';
} }
if (ret.indexOf('\n@import') + 1) {
var c0 = ret.split('\n'),
c1 = [],
c2 = [];
for (var a = 0; a < c0.length; a++)
(c0[a].startsWith('@import') ? c1 : c2).push(c0[a]);
ret = c1.concat(c2).join('\n');
}
} }
catch (ex) { catch (ex) {
console.log('could not read css', a, base); console.log('could not read css', a, base);
@ -7858,15 +7888,39 @@ var unpost = (function () {
if (!xhrchk(this, L.fu_xe1, L.fu_xe2)) if (!xhrchk(this, L.fu_xe1, L.fu_xe2))
return ebi('op_unpost').innerHTML = L.fu_xe1; return ebi('op_unpost').innerHTML = L.fu_xe1;
var res = JSON.parse(this.responseText); try {
var ores = JSON.parse(this.responseText);
}
catch (ex) {
return ebi('op_unpost').innerHTML = '<p>' + L.badreply + ':</p>' + unpre(this.responseText);
}
if (ores.u.length == 1 && ores.u[0].timeout) {
html.push('<p>' + L.un_nou + '</p>');
ores.u = [];
}
if (ores.c.length == 1 && ores.c[0].kinshi) {
html.push('<p>' + L.un_noc + '</p>');
ores.c = [];
}
for (var a = 0; a < ores.u.length; a++)
ores.u[a].k = 'u';
for (var a = 0; a < ores.c.length; a++)
ores.c[a].k = 'c';
var res = ores.u.concat(ores.c);
if (res.length) { if (res.length) {
if (res.length == 2000) if (res.length == 2000)
html.push("<p>" + L.un_max); html.push("<p>" + L.un_max);
else else
html.push("<p>" + L.un_avail.format(res.length)); html.push("<p>" + L.un_avail.format(ores.c.length, ores.u.length));
html.push(" &ndash; " + L.un_m2 + "</p>"); html.push("<br />" + L.un_m2 + "</p>");
html.push("<table><thead><tr><td></td><td>time</td><td>size</td><td>file</td></tr></thead><tbody>"); html.push("<table><thead><tr><td></td><td>time</td><td>size</td><td>done</td><td>file</td></tr></thead><tbody>");
} }
else else
html.push('-- <em>' + (filt.value ? L.un_no2 : L.un_no1) + '</em>'); html.push('-- <em>' + (filt.value ? L.un_no2 : L.un_no1) + '</em>');
@ -7879,10 +7933,13 @@ var unpost = (function () {
'<tr><td></td><td colspan="3" style="padding:.5em">' + '<tr><td></td><td colspan="3" style="padding:.5em">' +
'<a me="' + me + '" class="n' + a + '" n2="' + (a + mods[b]) + '<a me="' + me + '" class="n' + a + '" n2="' + (a + mods[b]) +
'" href="#">' + L.un_next.format(Math.min(mods[b], res.length - a)) + '</a></td></tr>'); '" href="#">' + L.un_next.format(Math.min(mods[b], res.length - a)) + '</a></td></tr>');
var done = res[a].k == 'c';
html.push( html.push(
'<tr><td><a me="' + me + '" class="n' + a + '" href="#">' + L.un_del + '</a></td>' + '<tr><td><a me="' + me + '" class="n' + a + '" href="#">' + (done ? L.un_del : L.un_abrt) + '</a></td>' +
'<td>' + unix2iso(res[a].at) + '</td>' + '<td>' + unix2iso(res[a].at) + '</td>' +
'<td>' + res[a].sz + '</td>' + '<td>' + ('' + res[a].sz).replace(/\B(?=(\d{3})+(?!\d))/g, " ") + '</td>' +
(done ? '<td>100%</td>' : '<td>' + res[a].pd + '%</td>') +
'<td>' + linksplit(res[a].vp).join('<span> / </span>') + '</td></tr>'); '<td>' + linksplit(res[a].vp).join('<span> / </span>') + '</td></tr>');
} }
@ -7968,7 +8025,7 @@ var unpost = (function () {
var xhr = new XHR(); var xhr = new XHR();
xhr.n = n; xhr.n = n;
xhr.n2 = n2; xhr.n2 = n2;
xhr.open('POST', SR + '/?delete&lim=' + req.length, true); xhr.open('POST', SR + '/?delete&unpost&lim=' + req.length, true);
xhr.onload = xhr.onerror = unpost_delete_cb; xhr.onload = xhr.onerror = unpost_delete_cb;
xhr.send(JSON.stringify(req)); xhr.send(JSON.stringify(req));
}; };

View file

@ -6,12 +6,12 @@
<title>{{ title }}</title> <title>{{ title }}</title>
<meta http-equiv="X-UA-Compatible" content="IE=edge"> <meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.8"> <meta name="viewport" content="width=device-width, initial-scale=0.8">
{{ html_head }}
<style> <style>
html{font-family:sans-serif} html{font-family:sans-serif}
td{border:1px solid #999;border-width:1px 1px 0 0;padding:0 5px} td{border:1px solid #999;border-width:1px 1px 0 0;padding:0 5px}
a{display:block} a{display:block}
</style> </style>
{{ html_head }}
</head> </head>
<body> <body>

View file

@ -2,6 +2,7 @@ html, body {
color: #333; color: #333;
background: #eee; background: #eee;
font-family: sans-serif; font-family: sans-serif;
font-family: var(--font-main), sans-serif;
line-height: 1.5em; line-height: 1.5em;
} }
html.y #helpbox a { html.y #helpbox a {
@ -67,6 +68,7 @@ a {
position: relative; position: relative;
display: inline-block; display: inline-block;
font-family: 'scp', monospace, monospace; font-family: 'scp', monospace, monospace;
font-family: var(--font-mono), 'scp', monospace, monospace;
font-weight: bold; font-weight: bold;
font-size: 1.3em; font-size: 1.3em;
line-height: .1em; line-height: .1em;

View file

@ -4,12 +4,12 @@
<meta http-equiv="X-UA-Compatible" content="IE=edge"> <meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.7"> <meta name="viewport" content="width=device-width, initial-scale=0.7">
<meta name="theme-color" content="#333"> <meta name="theme-color" content="#333">
{{ html_head }}
<link rel="stylesheet" href="{{ r }}/.cpr/ui.css?_={{ ts }}"> <link rel="stylesheet" href="{{ r }}/.cpr/ui.css?_={{ ts }}">
<link rel="stylesheet" href="{{ r }}/.cpr/md.css?_={{ ts }}"> <link rel="stylesheet" href="{{ r }}/.cpr/md.css?_={{ ts }}">
{%- if edit %} {%- if edit %}
<link rel="stylesheet" href="{{ r }}/.cpr/md2.css?_={{ ts }}"> <link rel="stylesheet" href="{{ r }}/.cpr/md2.css?_={{ ts }}">
{%- endif %} {%- endif %}
{{ html_head }}
</head> </head>
<body> <body>
<div id="mn"></div> <div id="mn"></div>

View file

@ -512,13 +512,6 @@ dom_navtgl.onclick = function () {
redraw(); redraw();
}; };
if (!HTTPS && location.hostname != '127.0.0.1') try {
ebi('edit2').onclick = function (e) {
toast.err(0, "the fancy editor is only available over https");
return ev(e);
}
} catch (ex) { }
if (sread('hidenav') == 1) if (sread('hidenav') == 1)
dom_navtgl.onclick(); dom_navtgl.onclick();

View file

@ -9,7 +9,7 @@
width: calc(100% - 56em); width: calc(100% - 56em);
} }
#mw { #mw {
left: calc(100% - 55em); left: max(0em, calc(100% - 55em));
overflow-y: auto; overflow-y: auto;
position: fixed; position: fixed;
bottom: 0; bottom: 0;
@ -56,6 +56,7 @@
padding: 0; padding: 0;
margin: 0; margin: 0;
font-family: 'scp', monospace, monospace; font-family: 'scp', monospace, monospace;
font-family: var(--font-mono), 'scp', monospace, monospace;
white-space: pre-wrap; white-space: pre-wrap;
word-break: break-word; word-break: break-word;
overflow-wrap: break-word; overflow-wrap: break-word;

View file

@ -368,14 +368,14 @@ function save(e) {
function save_cb() { function save_cb() {
if (this.status !== 200) if (this.status !== 200)
return toast.err(0, 'Error! The file was NOT saved.\n\n' + this.status + ": " + (this.responseText + '').replace(/^<pre>/, "")); return toast.err(0, 'Error! The file was NOT saved.\n\nError ' + this.status + ":\n" + unpre(this.responseText));
var r; var r;
try { try {
r = JSON.parse(this.responseText); r = JSON.parse(this.responseText);
} }
catch (ex) { catch (ex) {
return toast.err(0, 'Failed to parse reply from server:\n\n' + this.responseText); return toast.err(0, 'Error! The file was likely NOT saved.\n\nFailed to parse reply from server:\n\n' + unpre(this.responseText));
} }
if (!r.ok) { if (!r.ok) {
@ -418,7 +418,7 @@ function run_savechk(lastmod, txt, btn, ntry) {
function savechk_cb() { function savechk_cb() {
if (this.status !== 200) if (this.status !== 200)
return toast.err(0, 'Error! The file was NOT saved.\n\n' + this.status + ": " + (this.responseText + '').replace(/^<pre>/, "")); return toast.err(0, 'Error! The file was NOT saved.\n\nError ' + this.status + ":\n" + unpre(this.responseText));
var doc1 = this.txt.replace(/\r\n/g, "\n"); var doc1 = this.txt.replace(/\r\n/g, "\n");
var doc2 = this.responseText.replace(/\r\n/g, "\n"); var doc2 = this.responseText.replace(/\r\n/g, "\n");

View file

@ -17,6 +17,7 @@ html, body {
padding: 0; padding: 0;
min-height: 100%; min-height: 100%;
font-family: sans-serif; font-family: sans-serif;
font-family: var(--font-main), sans-serif;
background: #f7f7f7; background: #f7f7f7;
color: #333; color: #333;
} }

View file

@ -4,11 +4,11 @@
<meta http-equiv="X-UA-Compatible" content="IE=edge"> <meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.7"> <meta name="viewport" content="width=device-width, initial-scale=0.7">
<meta name="theme-color" content="#333"> <meta name="theme-color" content="#333">
{{ html_head }}
<link rel="stylesheet" href="{{ r }}/.cpr/ui.css?_={{ ts }}"> <link rel="stylesheet" href="{{ r }}/.cpr/ui.css?_={{ ts }}">
<link rel="stylesheet" href="{{ r }}/.cpr/mde.css?_={{ ts }}"> <link rel="stylesheet" href="{{ r }}/.cpr/mde.css?_={{ ts }}">
<link rel="stylesheet" href="{{ r }}/.cpr/deps/mini-fa.css?_={{ ts }}"> <link rel="stylesheet" href="{{ r }}/.cpr/deps/mini-fa.css?_={{ ts }}">
<link rel="stylesheet" href="{{ r }}/.cpr/deps/easymde.css?_={{ ts }}"> <link rel="stylesheet" href="{{ r }}/.cpr/deps/easymde.css?_={{ ts }}">
{{ html_head }}
</head> </head>
<body> <body>
<div id="mw"> <div id="mw">

View file

@ -134,14 +134,14 @@ function save(mde) {
function save_cb() { function save_cb() {
if (this.status !== 200) if (this.status !== 200)
return toast.err(0, 'Error! The file was NOT saved.\n\n' + this.status + ": " + (this.responseText + '').replace(/^<pre>/, "")); return toast.err(0, 'Error! The file was NOT saved.\n\nError ' + this.status + ":\n" + unpre(this.responseText));
var r; var r;
try { try {
r = JSON.parse(this.responseText); r = JSON.parse(this.responseText);
} }
catch (ex) { catch (ex) {
return toast.err(0, 'Failed to parse reply from server:\n\n' + this.responseText); return toast.err(0, 'Error! The file was likely NOT saved.\n\nFailed to parse reply from server:\n\n' + unpre(this.responseText));
} }
if (!r.ok) { if (!r.ok) {
@ -180,7 +180,7 @@ function save_cb() {
function save_chk() { function save_chk() {
if (this.status !== 200) if (this.status !== 200)
return toast.err(0, 'Error! The file was NOT saved.\n\n' + this.status + ": " + (this.responseText + '').replace(/^<pre>/, "")); return toast.err(0, 'Error! The file was NOT saved.\n\nError ' + this.status + ":\n" + unpre(this.responseText));
var doc1 = this.txt.replace(/\r\n/g, "\n"); var doc1 = this.txt.replace(/\r\n/g, "\n");
var doc2 = this.responseText.replace(/\r\n/g, "\n"); var doc2 = this.responseText.replace(/\r\n/g, "\n");

View file

@ -1,3 +1,8 @@
:root {
--font-main: sans-serif;
--font-serif: serif;
--font-mono: 'scp';
}
html,body,tr,th,td,#files,a { html,body,tr,th,td,#files,a {
color: inherit; color: inherit;
background: none; background: none;
@ -10,6 +15,7 @@ html {
color: #ccc; color: #ccc;
background: #333; background: #333;
font-family: sans-serif; font-family: sans-serif;
font-family: var(--font-main), sans-serif;
text-shadow: 1px 1px 0px #000; text-shadow: 1px 1px 0px #000;
touch-action: manipulation; touch-action: manipulation;
} }
@ -23,6 +29,7 @@ html, body {
} }
pre { pre {
font-family: monospace, monospace; font-family: monospace, monospace;
font-family: var(--font-mono), monospace, monospace;
} }
a { a {
color: #fc5; color: #fc5;

View file

@ -7,8 +7,8 @@
<meta http-equiv="X-UA-Compatible" content="IE=edge"> <meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.8"> <meta name="viewport" content="width=device-width, initial-scale=0.8">
<meta name="theme-color" content="#333"> <meta name="theme-color" content="#333">
{{ html_head }}
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/msg.css?_={{ ts }}"> <link rel="stylesheet" media="screen" href="{{ r }}/.cpr/msg.css?_={{ ts }}">
{{ html_head }}
</head> </head>
<body> <body>

View file

@ -2,6 +2,7 @@ html {
color: #333; color: #333;
background: #f7f7f7; background: #f7f7f7;
font-family: sans-serif; font-family: sans-serif;
font-family: var(--font-main), sans-serif;
touch-action: manipulation; touch-action: manipulation;
} }
#wrap { #wrap {
@ -127,6 +128,7 @@ pre, code {
color: #480; color: #480;
background: #fff; background: #fff;
font-family: 'scp', monospace, monospace; font-family: 'scp', monospace, monospace;
font-family: var(--font-mono), 'scp', monospace, monospace;
border: 1px solid rgba(128,128,128,0.3); border: 1px solid rgba(128,128,128,0.3);
border-radius: .2em; border-radius: .2em;
padding: .15em .2em; padding: .15em .2em;

View file

@ -7,9 +7,9 @@
<meta http-equiv="X-UA-Compatible" content="IE=edge"> <meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.8"> <meta name="viewport" content="width=device-width, initial-scale=0.8">
<meta name="theme-color" content="#333"> <meta name="theme-color" content="#333">
{{ html_head }}
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/splash.css?_={{ ts }}"> <link rel="stylesheet" media="screen" href="{{ r }}/.cpr/splash.css?_={{ ts }}">
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/ui.css?_={{ ts }}"> <link rel="stylesheet" media="screen" href="{{ r }}/.cpr/ui.css?_={{ ts }}">
{{ html_head }}
</head> </head>
<body> <body>
@ -78,13 +78,15 @@
<h1 id="cc">client config:</h1> <h1 id="cc">client config:</h1>
<ul> <ul>
{% if k304 or k304vis %}
{% if k304 %} {% if k304 %}
<li><a id="h" href="{{ r }}/?k304=n">disable k304</a> (currently enabled) <li><a id="h" href="{{ r }}/?k304=n">disable k304</a> (currently enabled)
{%- else %} {%- else %}
<li><a id="i" href="{{ r }}/?k304=y" class="r">enable k304</a> (currently disabled) <li><a id="i" href="{{ r }}/?k304=y" class="r">enable k304</a> (currently disabled)
{% endif %} {% endif %}
<blockquote id="j">enabling this will disconnect your client on every HTTP 304, which can prevent some buggy proxies from getting stuck (suddenly not loading pages), <em>but</em> it will also make things slower in general</blockquote></li> <blockquote id="j">enabling this will disconnect your client on every HTTP 304, which can prevent some buggy proxies from getting stuck (suddenly not loading pages), <em>but</em> it will also make things slower in general</blockquote></li>
{% endif %}
<li><a id="k" href="{{ r }}/?reset" class="r" onclick="localStorage.clear();return true">reset client settings</a></li> <li><a id="k" href="{{ r }}/?reset" class="r" onclick="localStorage.clear();return true">reset client settings</a></li>
</ul> </ul>

View file

@ -6,7 +6,7 @@ var Ls = {
"d1": "tilstand", "d1": "tilstand",
"d2": "vis tilstanden til alle tråder", "d2": "vis tilstanden til alle tråder",
"e1": "last innst.", "e1": "last innst.",
"e2": "leser inn konfigurasjonsfiler på nytt$N(kontoer, volumer, volumbrytere)$Nog kartlegger alle e2ds-volumer", "e2": "leser inn konfigurasjonsfiler på nytt$N(kontoer, volumer, volumbrytere)$Nog kartlegger alle e2ds-volumer$N$Nmerk: endringer i globale parametere$Nkrever en full restart for å ta gjenge",
"f1": "du kan betrakte:", "f1": "du kan betrakte:",
"g1": "du kan laste opp til:", "g1": "du kan laste opp til:",
"cc1": "klient-konfigurasjon", "cc1": "klient-konfigurasjon",
@ -30,7 +30,7 @@ var Ls = {
}, },
"eng": { "eng": {
"d2": "shows the state of all active threads", "d2": "shows the state of all active threads",
"e2": "reload config files (accounts/volumes/volflags),$Nand rescan all e2ds volumes", "e2": "reload config files (accounts/volumes/volflags),$Nand rescan all e2ds volumes$N$Nnote: any changes to global settings$Nrequire a full restart to take effect",
"u2": "time since the last server write$N( upload / rename / ... )$N$N17d = 17 days$N1h23 = 1 hour 23 minutes$N4m56 = 4 minutes 56 seconds", "u2": "time since the last server write$N( upload / rename / ... )$N$N17d = 17 days$N1h23 = 1 hour 23 minutes$N4m56 = 4 minutes 56 seconds",
"v2": "use this server as a local HDD$N$NWARNING: this will show your password!", "v2": "use this server as a local HDD$N$NWARNING: this will show your password!",
} }

View file

@ -7,10 +7,10 @@
<meta http-equiv="X-UA-Compatible" content="IE=edge"> <meta http-equiv="X-UA-Compatible" content="IE=edge">
<meta name="viewport" content="width=device-width, initial-scale=0.8"> <meta name="viewport" content="width=device-width, initial-scale=0.8">
<meta name="theme-color" content="#333"> <meta name="theme-color" content="#333">
{{ html_head }}
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/splash.css?_={{ ts }}"> <link rel="stylesheet" media="screen" href="{{ r }}/.cpr/splash.css?_={{ ts }}">
<link rel="stylesheet" media="screen" href="{{ r }}/.cpr/ui.css?_={{ ts }}"> <link rel="stylesheet" media="screen" href="{{ r }}/.cpr/ui.css?_={{ ts }}">
<style>ul{padding-left:1.3em}li{margin:.4em 0}</style> <style>ul{padding-left:1.3em}li{margin:.4em 0}</style>
{{ html_head }}
</head> </head>
<body> <body>

View file

@ -1,4 +1,8 @@
:root { :root {
--font-main: sans-serif;
--font-serif: serif;
--font-mono: 'scp';
--fg: #ccc; --fg: #ccc;
--fg-max: #fff; --fg-max: #fff;
--bg-u2: #2b2b2b; --bg-u2: #2b2b2b;
@ -378,6 +382,7 @@ html.y textarea:focus {
.mdo code, .mdo code,
.mdo tt { .mdo tt {
font-family: 'scp', monospace, monospace; font-family: 'scp', monospace, monospace;
font-family: var(--font-mono), 'scp', monospace, monospace;
white-space: pre-wrap; white-space: pre-wrap;
word-break: break-all; word-break: break-all;
} }
@ -447,6 +452,7 @@ html.y textarea:focus {
} }
.mdo blockquote { .mdo blockquote {
font-family: serif; font-family: serif;
font-family: var(--font-serif), serif;
background: #f7f7f7; background: #f7f7f7;
border: .07em dashed #ccc; border: .07em dashed #ccc;
padding: 0 2em; padding: 0 2em;

View file

@ -1722,8 +1722,6 @@ function up2k_init(subtle) {
ebi('u2etas').style.textAlign = 'left'; ebi('u2etas').style.textAlign = 'left';
} }
etafun(); etafun();
if (pvis.act == 'bz')
pvis.changecard('bz');
} }
if (flag) { if (flag) {
@ -1859,6 +1857,9 @@ function up2k_init(subtle) {
timer.rm(donut.do); timer.rm(donut.do);
ebi('u2tabw').style.minHeight = '0px'; ebi('u2tabw').style.minHeight = '0px';
utw_minh = 0; utw_minh = 0;
if (pvis.act == 'bz')
pvis.changecard('bz');
} }
function chill(t) { function chill(t) {
@ -2256,6 +2257,7 @@ function up2k_init(subtle) {
console.log('handshake onerror, retrying', t.name, t); console.log('handshake onerror, retrying', t.name, t);
apop(st.busy.handshake, t); apop(st.busy.handshake, t);
st.todo.handshake.unshift(t); st.todo.handshake.unshift(t);
t.cooldown = Date.now() + 5000 + Math.floor(Math.random() * 3000);
t.keepalive = keepalive; t.keepalive = keepalive;
}; };
var orz = function (e) { var orz = function (e) {
@ -2263,16 +2265,26 @@ function up2k_init(subtle) {
return console.log('zombie handshake onload', t.name, t); return console.log('zombie handshake onload', t.name, t);
if (xhr.status == 200) { if (xhr.status == 200) {
try {
var response = JSON.parse(xhr.responseText);
}
catch (ex) {
apop(st.busy.handshake, t);
st.todo.handshake.unshift(t);
t.cooldown = Date.now() + 5000 + Math.floor(Math.random() * 3000);
return toast.err(0, 'Handshake error; will retry...\n\n' + L.badreply + ':\n\n' + unpre(xhr.responseText));
}
t.t_handshake = Date.now(); t.t_handshake = Date.now();
if (keepalive) { if (keepalive) {
apop(st.busy.handshake, t); apop(st.busy.handshake, t);
tasker();
return; return;
} }
if (toast.tag === t) if (toast.tag === t)
toast.ok(5, L.u_fixed); toast.ok(5, L.u_fixed);
var response = JSON.parse(xhr.responseText);
if (!response.name) { if (!response.name) {
var msg = '', var msg = '',
smsg = ''; smsg = '';
@ -2856,6 +2868,8 @@ function up2k_init(subtle) {
new_state = false; new_state = false;
fixed = true; fixed = true;
} }
if (new_state === undefined)
new_state = can_write ? false : have_up2k_idx ? true : undefined;
} }
if (new_state === undefined) if (new_state === undefined)

View file

@ -1417,9 +1417,12 @@ function lf2br(txt) {
} }
function unpre(txt) { function hunpre(txt) {
return ('' + txt).replace(/^<pre>/, ''); return ('' + txt).replace(/^<pre>/, '');
} }
function unpre(txt) {
return esc(hunpre(txt));
}
var toast = (function () { var toast = (function () {
@ -1995,15 +1998,21 @@ function xhrchk(xhr, prefix, e404, lvl, tag) {
if (tag === undefined) if (tag === undefined)
tag = prefix; tag = prefix;
var errtxt = (xhr.response && xhr.response.err) || xhr.responseText, var errtxt = ((xhr.response && xhr.response.err) || xhr.responseText) || '',
suf = '',
fun = toast[lvl || 'err'], fun = toast[lvl || 'err'],
is_cf = /[Cc]loud[f]lare|>Just a mo[m]ent|#cf-b[u]bbles|Chec[k]ing your br[o]wser|\/chall[e]nge-platform|"chall[e]nge-error|nable Ja[v]aScript and cook/.test(errtxt); is_cf = /[Cc]loud[f]lare|>Just a mo[m]ent|#cf-b[u]bbles|Chec[k]ing your br[o]wser|\/chall[e]nge-platform|"chall[e]nge-error|nable Ja[v]aScript and cook/.test(errtxt);
if (errtxt.startsWith('<pre>'))
suf = '\n\nerror-details: «' + unpre(errtxt).split('\n')[0].trim() + '»';
else
errtxt = esc(errtxt).slice(0, 32768);
if (xhr.status == 403 && !is_cf) if (xhr.status == 403 && !is_cf)
return toast.err(0, prefix + (L && L.xhr403 || "403: access denied\n\ntry pressing F5, maybe you got logged out"), tag); return toast.err(0, prefix + (L && L.xhr403 || "403: access denied\n\ntry pressing F5, maybe you got logged out") + suf, tag);
if (xhr.status == 404) if (xhr.status == 404)
return toast.err(0, prefix + e404, tag); return toast.err(0, prefix + e404 + suf, tag);
if (is_cf && (xhr.status == 403 || xhr.status == 503)) { if (is_cf && (xhr.status == 403 || xhr.status == 503)) {
var now = Date.now(), td = now - cf_cha_t; var now = Date.now(), td = now - cf_cha_t;

View file

@ -13,6 +13,9 @@
# other stuff # other stuff
## [`TODO.md`](TODO.md)
* planned features / fixes / changes
## [`example.conf`](example.conf) ## [`example.conf`](example.conf)
* example config file for `-c` * example config file for `-c`

31 docs/TODO.md Normal file
View file

@ -0,0 +1,31 @@
a living list of upcoming features / fixes / changes, very roughly in order of priority
* readme / docs
  * docker ftp config
  * custom-fonts (copy from issue)
  * s3 speedfix
  * reverseproxy/cloudflare: ensure cloudflare does not terminate https
  * docker: suggest putting hists in /cfg/hists/

* [github issue #62](https://github.com/9001/copyparty/issues/62) - IdP / single-sign-on powered by a local identity provider service which is possibly hooked up to ldap or an oauth service
  * secret token header between reverse-proxy and copyparty to confirm the headers are legit
  * persist autogenerated volumes for db-init + nullmapping on next startup (`_map_volume` += `only_if_exist`)
  * sanity-check autogenerated volumes below an inaccessible parent
  * disable logout links if idp detected

* download accelerator
  * definitely download chunks in parallel
  * maybe resumable downloads (chrome-only, jank api)
  * maybe checksum validation (return sha512 of requested range in responses, and probably also warks)

* [github issue #64](https://github.com/9001/copyparty/issues/64) - dirkeys 2nd season
  * popular feature request, finally time to refactor browser.js i suppose...

* [github issue #37](https://github.com/9001/copyparty/issues/37) - upload PWA
  * or [maybe not](https://arstechnica.com/tech-policy/2024/02/apple-under-fire-for-disabling-iphone-web-apps-eu-asks-developers-to-weigh-in/), or [maybe](https://arstechnica.com/gadgets/2024/03/apple-changes-course-will-keep-iphone-eu-web-apps-how-they-are-in-ios-17-4/)

* [github issue #57](https://github.com/9001/copyparty/issues/57) - config GUI
  * configs given to -c can be ordered with numerical prefix
  * autorevert settings if they fail to apply
  * countdown until session invalidates in settings gui, with refresh-button

View file

@ -218,7 +218,7 @@ if you don't need all the features, you can repack the sfx and save a bunch of s
* `269k` after `./scripts/make-sfx.sh re no-cm no-hl` * `269k` after `./scripts/make-sfx.sh re no-cm no-hl`
the features you can opt to drop are the features you can opt to drop are
* `cm`/easymde, the "fancy" markdown editor, saves ~82k * `cm`/easymde, the "fancy" markdown editor, saves ~89k
* `hl`, prism, the syntax hilighter, saves ~41k * `hl`, prism, the syntax hilighter, saves ~41k
* `fnt`, source-code-pro, the monospace font, saves ~9k * `fnt`, source-code-pro, the monospace font, saves ~9k
* `dd`, the custom mouse cursor for the media player tray tab, saves ~2k * `dd`, the custom mouse cursor for the media player tray tab, saves ~2k

34 docs/rice/README.md Normal file
View file

@ -0,0 +1,34 @@
# custom fonts
to change the fonts in the web-UI, first save the following text (the default font-config) to a new css file, for example named `customfonts.css` in your webroot:
```css
:root {
--font-main: sans-serif;
--font-serif: serif;
--font-mono: 'scp';
}
```
add this to your copyparty config so the css file gets loaded: `--html-head='<link rel="stylesheet" href="/customfonts.css">'`
alternatively, if you are using a config file instead of commandline args:
```yaml
[global]
html-head: <link rel="stylesheet" href="/customfonts.css">
```
restart copyparty for the config change to take effect
edit the css file you made and press `ctrl`-`shift`-`R` in the browser to see the changes as you go (no need to restart copyparty for each change)
if you are introducing a new ttf/woff font, don't forget to declare the font itself in the css file; here's one of the default fonts from `ui.css`:
```css
@font-face {
font-family: 'scp';
font-display: swap;
src: local('Source Code Pro Regular'), local('SourceCodePro-Regular'), url(deps/scp.woff2) format('woff2');
}
```
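
putting it all together, a complete `customfonts.css` might look something like the sketch below -- note that the font name `ComicMono` and the `ComicMono.woff2` file are made-up examples for illustration, not something copyparty ships; substitute whatever font you actually declared:

```css
/* hypothetical example: declare a custom webfont and make it the main UI font */
@font-face {
  font-family: 'ComicMono';                    /* made-up name for this sketch */
  font-display: swap;
  src: url(ComicMono.woff2) format('woff2');   /* example path; served from the webroot next to customfonts.css */
}

:root {
  --font-main: 'ComicMono';  /* regular body text */
  --font-serif: serif;       /* blockquotes in the markdown viewer */
  --font-mono: 'scp';        /* code blocks; 'scp' is the bundled source-code-pro */
}
```

here only `--font-main` is overridden; the other two variables are left at the defaults from the first snippet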

View file

@ -24,7 +24,7 @@ ENV ver_asmcrypto=c72492f4a66e17a0e5dd8ad7874de354f3ccdaa5 \
# the scp url is regular latin from https://fonts.googleapis.com/css2?family=Source+Code+Pro&display=swap # the scp url is regular latin from https://fonts.googleapis.com/css2?family=Source+Code+Pro&display=swap
RUN mkdir -p /z/dist/no-pk \ RUN mkdir -p /z/dist/no-pk \
&& wget https://fonts.gstatic.com/s/sourcecodepro/v11/HI_SiYsKILxRpg3hIP6sJ7fM7PqlPevW.woff2 -O scp.woff2 \ && wget https://fonts.gstatic.com/s/sourcecodepro/v11/HI_SiYsKILxRpg3hIP6sJ7fM7PqlPevW.woff2 -O scp.woff2 \
&& apk add cmake make g++ git bash npm patch wget tar pigz brotli gzip unzip python3 python3-dev brotli py3-brotli \ && apk add cmake make g++ git bash npm patch wget tar pigz brotli gzip unzip python3 python3-dev py3-brotli \
&& rm -f /usr/lib/python3*/EXTERNALLY-MANAGED \ && rm -f /usr/lib/python3*/EXTERNALLY-MANAGED \
&& wget https://github.com/openpgpjs/asmcrypto.js/archive/$ver_asmcrypto.tar.gz -O asmcrypto.tgz \ && wget https://github.com/openpgpjs/asmcrypto.js/archive/$ver_asmcrypto.tar.gz -O asmcrypto.tgz \
&& wget https://github.com/markedjs/marked/archive/v$ver_marked.tar.gz -O marked.tgz \ && wget https://github.com/markedjs/marked/archive/v$ver_marked.tar.gz -O marked.tgz \
@ -143,9 +143,8 @@ RUN ./genprism.sh $ver_prism
# compress # compress
COPY brotli.makefile zopfli.makefile /z/dist/ COPY zopfli.makefile /z/dist/
RUN cd /z/dist \ RUN cd /z/dist \
&& make -j$(nproc) -f brotli.makefile \
&& make -j$(nproc) -f zopfli.makefile \ && make -j$(nproc) -f zopfli.makefile \
&& rm *.makefile \ && rm *.makefile \
&& mv no-pk/* . \ && mv no-pk/* . \

View file

@ -1,4 +0,0 @@
all: $(addsuffix .br, $(wildcard easymde*))
%.br: %
brotli -jZ $<

View file

@ -37,7 +37,7 @@ help() { exec cat <<'EOF'
# _____________________________________________________________________ # _____________________________________________________________________
# web features: # web features:
# #
# `no-cm` saves ~82k by removing easymde/codemirror # `no-cm` saves ~89k by removing easymde/codemirror
# (the fancy markdown editor) # (the fancy markdown editor)
# #
# `no-hl` saves ~41k by removing syntax hilighting in the text viewer # `no-hl` saves ~41k by removing syntax hilighting in the text viewer
@ -406,7 +406,7 @@ find -type f -name ._\* | while IFS= read -r f; do cmp <(printf '\x00\x05\x16')
rm -f copyparty/web/deps/*.full.* copyparty/web/dbg-* copyparty/web/Makefile rm -f copyparty/web/deps/*.full.* copyparty/web/dbg-* copyparty/web/Makefile
find copyparty | LC_ALL=C sort | sed -r 's/\.(gz|br)$//;s/$/,/' > have find copyparty | LC_ALL=C sort | sed -r 's/\.gz$//;s/$/,/' > have
cat have | while IFS= read -r x; do cat have | while IFS= read -r x; do
grep -qF -- "$x" ../scripts/sfx.ls || { grep -qF -- "$x" ../scripts/sfx.ls || {
echo "unexpected file: $x" echo "unexpected file: $x"
@ -603,7 +603,7 @@ sed -r 's/(.*)\.(.*)/\2 \1/' | LC_ALL=C sort |
sed -r 's/([^ ]*) (.*)/\2.\1/' | grep -vE '/list1?$' > list1 sed -r 's/([^ ]*) (.*)/\2.\1/' | grep -vE '/list1?$' > list1
for n in {1..50}; do for n in {1..50}; do
(grep -vE '\.(gz|br)$' list1; grep -E '\.(gz|br)$' list1 | (shuf||gshuf) ) >list || true (grep -vE '\.gz$' list1; grep -E '\.gz$' list1 | (shuf||gshuf) ) >list || true
s=$( (sha1sum||shasum) < list | cut -c-16) s=$( (sha1sum||shasum) < list | cut -c-16)
grep -q $s "$zdir/h" 2>/dev/null && continue grep -q $s "$zdir/h" 2>/dev/null && continue
echo $s >> "$zdir/h" echo $s >> "$zdir/h"

View file

@ -119,13 +119,13 @@ class Cfg(Namespace):
ex = "ah_cli ah_gen css_browser hist ipa_re js_browser no_forget no_hash no_idx nonsus_urls" ex = "ah_cli ah_gen css_browser hist ipa_re js_browser no_forget no_hash no_idx nonsus_urls"
ka.update(**{k: None for k in ex.split()}) ka.update(**{k: None for k in ex.split()})
ex = "hash_mt srch_time u2j" ex = "hash_mt srch_time u2abort u2j"
ka.update(**{k: 1 for k in ex.split()}) ka.update(**{k: 1 for k in ex.split()})
ex = "reg_cap s_thead s_tbody th_convt" ex = "reg_cap s_thead s_tbody th_convt"
ka.update(**{k: 9 for k in ex.split()}) ka.update(**{k: 9 for k in ex.split()})
ex = "db_act df loris re_maxage rproxy rsp_jtr rsp_slp s_wr_slp snap_wri theme themes turbo" ex = "db_act df k304 loris re_maxage rproxy rsp_jtr rsp_slp s_wr_slp snap_wri theme themes turbo"
ka.update(**{k: 0 for k in ex.split()}) ka.update(**{k: 0 for k in ex.split()})
ex = "ah_alg bname doctitle exit favico idp_h_usr html_head lg_sbf log_fk md_sbf name textfiles unlist vname R RS SR" ex = "ah_alg bname doctitle exit favico idp_h_usr html_head lg_sbf log_fk md_sbf name textfiles unlist vname R RS SR"