stz is e280's standard library of environment-agnostic typescript tools. zero dependencies.
### pub and sub
ergonomic event emitters

```ts
import {pub, sub} from "@e280/stz"
```

- make a publisher fn
  ```ts
  // create a pub fn
  const sendMessage = pub<[string]>()

  // subscribe to it
  sendMessage.subscribe(m => console.log(m))

  // publish to it
  sendMessage("hello")
  ```
- make a subscriber fn – it's just like pub, except it's flipsy-reversey!
  ```ts
  // create a sub fn
  const onMessage = sub<[string]>()

  // subscribe to it
  onMessage(m => console.log(m))

  // publish to it
  onMessage.publish("hello")
  ```
- pub and sub both have the same facilities: `.publish`, `.subscribe`, `.on`, `.next`, `.clear`
- i seem to use `sub` more often
- publish actually returns a promise, to wait for all async subscribers
  ```ts
  await onMessage.publish("hello")
  ```
- subscribe returns a fn to unsubscribe
  ```ts
  const unsubscribe = onMessage(() => {})
  unsubscribe()
  ```
- `.clear()` wipes all subscribed listeners
  ```ts
  onMessage.clear()
  ```
- `.next(fn?)` is a better way to do a `.once`
- you can use it like a `.once`:
  ```ts
  onMessage.next(message => {})
  ```
- but it also gives you a promise like this:
  ```ts
  const [message] = await onMessage.next()
  ```
- of course the promise can be used like this:
  ```ts
  onMessage.next().then(([message]) => {})
  ```
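to make the mechanics concrete, here's a minimal sketch of a pub-style emitter with these semantics – purely illustrative, not stz's actual implementation (the name `makePub` is made up):

```typescript
type Listener<A extends any[]> = (...args: A) => void | Promise<void>

function makePub<A extends any[]>() {
	const listeners = new Set<Listener<A>>()

	// calling publish awaits every subscriber, async ones included
	const publish = async (...args: A) => {
		await Promise.all([...listeners].map(listener => listener(...args)))
	}

	return Object.assign(publish, {
		subscribe(listener: Listener<A>) {
			listeners.add(listener)
			return () => { listeners.delete(listener) } // unsubscribe fn
		},
		clear: () => listeners.clear(),
	})
}
```

the trick is just `Object.assign` on a function, which is why the same object can be called directly *and* carry methods like `.subscribe`.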
### defer
defer the resolve/reject of a promise to the outside

```ts
import {defer} from "@e280/stz"

const deferred = defer()
```

- resolve the deferred promise
  ```ts
  deferred.resolve()
  ```
- reject the deferred promise
  ```ts
  deferred.reject(new Error("fail"))
  ```
- await the promise
  ```ts
  await deferred.promise
  ```
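a defer-style helper can be sketched in a few lines – this is a guess at the shape, not stz's actual code (`makeDeferred` is a hypothetical name):

```typescript
// a minimal deferred: expose a promise together with its resolve/reject
function makeDeferred<T = void>() {
	let resolve!: (value: T) => void
	let reject!: (reason?: unknown) => void
	const promise = new Promise<T>((res, rej) => {
		resolve = res
		reject = rej
	})
	return {promise, resolve, reject}
}
```

newer runtimes offer the same shape natively via `Promise.withResolvers()`.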
### nap
sleep for some milliseconds

```ts
import {nap} from "@e280/stz"

await nap(900)
  // wait for 900 milliseconds
```

### all
it's just sugar for `Promise.all`
```ts
import {all, nap} from "@e280/stz"

await all(
	nap(500),
	Promise.resolve("hello"),
	fetch("whatever.json"),
)
```

### concurrent
sugar for `Promise.all`, but returns named things as an object
```ts
import {concurrent, nap} from "@e280/stz"

const {slept, hello, whatever} = await concurrent({
	slept: nap(500),
	hello: Promise.resolve("hello"),
	whatever: fetch("whatever.json"),
})
```

### disposer
easy trash management
```ts
import {disposer} from "@e280/stz"
```

- create a disposer
  ```ts
  const dispose = disposer()
  ```
- schedule something for cleanup
  ```ts
  dispose.schedule(() => console.log("disposed!"))
  ```
- schedule multiple things at once
  ```ts
  dispose.schedule(
  	() => console.log("disposed thing 1"),
  	() => console.log("disposed thing 2"),
  	() => ev(window, {keydown: () => console.log("keydown")}),
  )
  ```
- schedule is chainable if you prefer that vibe
  ```ts
  dispose
  	.schedule(() => console.log("disposed thing 1"))
  	.schedule(() => console.log("disposed thing 2"))
  	.schedule(() => ev(window, {keydown: () => console.log("keydown")}))
  ```
- dispose of all that garbage
  ```ts
  dispose()
  ```
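a disposer like this can be sketched as a callable that collects cleanup fns – illustrative only (`makeDisposer` is made up, and the last-in-first-out ordering here is just one reasonable policy, not necessarily stz's):

```typescript
type Disposer = (() => void) & {
	schedule: (...fns: Array<() => void>) => Disposer
}

function makeDisposer(): Disposer {
	const tasks: Array<() => void> = []

	const dispose = (() => {
		// run cleanups last-in-first-out (an assumed policy, not stz's documented order)
		while (tasks.length > 0)
			tasks.pop()!()
	}) as Disposer

	dispose.schedule = (...fns) => {
		tasks.push(...fns)
		return dispose // returning the disposer is what makes .schedule chainable
	}

	return dispose
}
```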
## extended js data types

### GMap
extended js Map

- many are saying it's "The Deluxe Mapping Experience"

```ts
import {GMap} from "@e280/stz"

const map = new GMap<number, string>([
	[1, "hello"],
	[2, "world"],
])
```

- `map.require(key)` – returns the value for key.. if missing, throws an error
  ```ts
  const value = map.require(1) // "hello"
  ```
- `map.guarantee(key, makeFn)` – returns the value for key.. if missing, runs `makeFn` to set and return the value
  ```ts
  const value = map.guarantee(3, () => "rofl") // "rofl"
  ```
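the require/guarantee semantics are easy to picture on a plain Map subclass – a hypothetical `DeluxeMap`, not GMap's real source:

```typescript
class DeluxeMap<K, V> extends Map<K, V> {
	// return the value, or throw if the key is absent
	require(key: K): V {
		if (!this.has(key))
			throw new Error(`missing key: ${String(key)}`)
		return this.get(key)!
	}

	// return the value, lazily creating it if the key is absent
	guarantee(key: K, make: () => V): V {
		if (!this.has(key))
			this.set(key, make())
		return this.get(key)!
	}
}
```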
### GSet
extended js Set

```ts
new GSet<T>()
```

- `set.adds(item1, item2, item3)` – add multiple items without a for-loop
- `set.deletes(item1, item2, item3)` – delete multiple items without a for-loop
### GWeakMap
extended js WeakMap

```ts
new GWeakMap<K, V>()
```

- `weakMap.require(key)` – returns the value for key.. if missing, throws an error
- `weakMap.guarantee(key, makeFn)` – returns the value for key.. if missing, runs `makeFn` to set and return the value
### queue
execute calls in sequence (not concurrent)

```ts
import {queue, nap} from "@e280/stz"

const fn = queue(async() => nap(100))

fn()
fn()
await fn() // waits for the previous calls (sequentially)
```

### once
ensure a fn is only executed one time
```ts
import {once} from "@e280/stz"

let count = 0
const fn = once(() => count++)

console.log(count) // 0
fn()
console.log(count) // 1
fn()
console.log(count) // 1
```

### deadline
throws an error if the async function takes too long
```ts
import {deadline, nap} from "@e280/stz"

const fn = deadline(100, async() => {
	// example deliberately takes too long
	await nap(200)
})

await fn()
	// DeadlineError: deadline exceeded (0.1 seconds)
```

### debounce
wait some time before actually executing the fn (absorbing redundant calls)
we use debounce a lot in ui code, like on a user's keyboard input in a form field. rendering the form input can actually be slow enough to cause problems when they type fast – to eliminate the jank, we debounce with something like 400 ms, waiting for the user to pause typing before actually running the validation.
```ts
import {debounce} from "@e280/stz"

const fn = debounce(100, async() => {
	await coolAction()
})

// each fn() call resets the timer
fn()
fn()
fn()
// coolAction is only called once here, other calls are redundant
```

### microbounce
collapse multiple calls into a single call (uses queueMicrotask under the hood)
it's like debounce(0, fn) but more efficient by using queueMicrotask instead of setTimeout
```ts
import {microbounce} from "@e280/stz"

const fn = microbounce(async() => coolAction())

fn()
fn()
fn() // previous calls are redundant
```

### cycle
execute a function over and over again, back to back
```ts
import {cycle, nap} from "@e280/stz"

let ticks = 0

const stop = cycle(async() => {
	// use a nap to add a delay between each execution
	await nap(200)
	ticks++
})

// stop repeating whenever you want
stop()
```

### txt
convert to/from utf8 string format

- `txt.fromBytes(bytes)` – bytes to string
- `txt.toBytes(string)` – string to bytes
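utf8 conversion like this is typically a thin wrapper over the standard `TextEncoder`/`TextDecoder` – a sketch of the same shape, not necessarily stz's exact code:

```typescript
const encoder = new TextEncoder()
const decoder = new TextDecoder()

// string -> Uint8Array of utf8 bytes
const toBytes = (text: string) => encoder.encode(text)

// Uint8Array of utf8 bytes -> string
const fromBytes = (bytes: Uint8Array) => decoder.decode(bytes)
```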
### bytes
utilities for dealing with Uint8Array

- `bytes.eq(bytesA, bytesB)` – check if two byte arrays are equal
- `bytes.random(32)` – generate crypto-random bytes
### BaseX codecs
convert binary data to/from various encodings

```ts
import {hex, base58, base64} from "@e280/stz"
```

all BaseX utilities have these methods:

- `hex(u8array)` – encode bytes to string (alias for `hex.fromBytes`)
- `hex.fromBytes(u8array)` – encode bytes to string
- `hex.toBytes(string)` – decode string to bytes
- `hex.toInteger(string)` – decode string as js integer
- `hex.fromInteger(n)` – encode js integer as a string
- `hex.random(32)` – generate random encoded string (32 bytes)

available codecs: `hex`, `base2`, `base36`, `base58`, `base62`, `base64`, `base64url`

- you can provide a lexicon to produce your own BaseX codec
  ```ts
  const myHex = new BaseX({characters: "0123456789abcdef"})
  ```
- fun fact: you can make insanely compact timestamp strings like this:
  ```ts
  base62.fromInteger(Math.floor(Date.now() / 1000)) // "1uK3au"
  ```

  | string | format |
  |--------|--------|
  | `1748388028` | base10 epoch seconds (10 chars) |
  | `1uK3au` | base62 epoch seconds (6 chars) |

- nice
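the integer trick works because each base62 character carries about 5.95 bits. here's a from-scratch sketch of integer encode/decode – note the character order below is an assumption, so it won't necessarily reproduce stz's exact strings like "1uK3au":

```typescript
// an assumed base62 lexicon (stz's actual character order may differ)
const lexicon = "0123456789ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz"

function fromInteger(n: number) {
	if (n === 0)
		return lexicon[0]
	let out = ""
	for (let x = Math.floor(n); x > 0; x = Math.floor(x / 62))
		out = lexicon[x % 62] + out
	return out
}

function toInteger(s: string) {
	let n = 0
	for (const char of s)
		n = (n * 62) + lexicon.indexOf(char)
	return n
}
```

epoch seconds fit in 6 characters because 62^6 ≈ 5.7e10 comfortably exceeds current ~1.7e9-second timestamps.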
### bytename
friendly string encoding for binary data

a bytename looks like "midsen.picmyn.widrep.baclut dotreg.filtyp.nosnus.siptev". that's 16 bytes. each byte maps to a three-letter triplet.

the bytename parser (`bytename.toBytes`) ignores all non-alphabetic characters. thus `midsen.picmyn`, `midsenpicmyn`, and `mid@sen$pic@myn` are all equal.
```ts
import {bytename} from "@e280/stz"
```

- encode bytes to a bytename string
  ```ts
  bytename.fromBytes(new Uint8Array([0xDE, 0xAD, 0xBE, 0xEF]))
  	// "ribmug.hilmun"
  ```
- decode a bytename string back to bytes
  ```ts
  bytename.toBytes("ribmug.hilmun")
  	// Uint8Array, 4 bytes
  ```
- you can customize the grouping
  ```ts
  const data = new Uint8Array([
  	0xDE, 0xAD, 0xBE, 0xEF,
  	0xDE, 0xAD, 0xBE, 0xEF,
  ])

  bytename.fromBytes(data, {
  	groupSize: 2, // default is 4
  	groupSeparator: " ",
  	wordSeparator: ".",
  })
  	// "ribmug.hilmun ribmug.hilmun"
  ```
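the byte-to-triplet idea can be demonstrated with a toy codec: split each byte into a 4-bit lead consonant, a 2-bit vowel, and a 2-bit tail consonant. the alphabets below are invented for illustration, so this will NOT reproduce stz's real words like "ribmug.hilmun":

```typescript
// invented alphabets – 16 + 4 + 4 symbols cover all 256 byte values
const leads = "bcdfghjklmnprstv" // 16 options, high nibble
const vowels = "aeio"            // 4 options, bits 3-2
const tails = "klmn"             // 4 options, bits 1-0

function byteToTriplet(byte: number) {
	return leads[byte >> 4] + vowels[(byte >> 2) & 3] + tails[byte & 3]
}

function tripletToByte(triplet: string) {
	return (leads.indexOf(triplet[0]) << 4)
		| (vowels.indexOf(triplet[1]) << 2)
		| tails.indexOf(triplet[2])
}
```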
### thumbprint
hybrid of bytename and base58 to make binary data more human-friendly

- looks like `nodlyn.fasrep.habbud.ralwel.Avo7gFmdWMRHkwsD149mcaBoZdS69iXuJ`
- the idea is that the first parts are in bytename format, so it's easy for humans to recognize
- and the remaining data is shown in base58

- `thumbprint.fromBytes(u8array)` – encode bytes to thumbprint string
- `thumbprint.toBytes(thumbstring)` – decode thumbprint string to bytes
- `thumbprint.fromHex(hexstring)` – convert a hex string into a thumbprint
- `thumbprint.toHex(thumbstring)` – convert a thumbprint into a hex string
### toq
tar-like binary file format for efficiently packing multiple files together

```ts
import {toq, txt} from "@e280/stz"
```

- 4 magic bytes
  ```
  "TOQ\x01"
  ```
- for each file (little endian)

  | field | size |
  |-------|------|
  | name length | 1 byte (u8) |
  | name | x bytes (max 255 B) |
  | data length | 4 bytes (u32) |
  | data | x bytes (max 4 GB) |
- `toq.pack` – accepts any iterable of file entries
  ```ts
  const pack: Uint8Array = toq.pack([
  	["hello.txt", txt.toBytes("hello world")],
  	["deadbeef.data", new Uint8Array([0xDE, 0xAD, 0xBE, 0xEF])],
  ])
  ```
- `toq.is` – check if a file is a toq pack or not
  ```ts
  toq.is(pack) // true
  ```
- `toq.unpack` – generator fn yields file entries
  ```ts
  for (const [name, data] of toq.unpack(pack))
  	console.log(name, data.length)
  ```
- pack a map of files
  ```ts
  const files = new Map<string, Uint8Array>()
  files.set("hello.txt", txt.toBytes("hello world"))
  files.set("deadbeef.data", new Uint8Array([0xDE, 0xAD, 0xBE, 0xEF]))

  const pack = toq.pack(files)
  ```
- unpack into a new map
  ```ts
  const files = new Map(toq.unpack(pack))
  ```
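since the layout is so simple, a from-scratch reader/writer is a nice way to understand it – this sketch follows the byte layout documented above (magic bytes, u8 name length, u32 little-endian data length), but it is not stz's actual code:

```typescript
const magic = new Uint8Array([0x54, 0x4F, 0x51, 0x01]) // "TOQ\x01"
const encoder = new TextEncoder()
const decoder = new TextDecoder()

function packToq(entries: Iterable<[string, Uint8Array]>) {
	const chunks: Uint8Array[] = [magic]
	for (const [name, data] of entries) {
		const nameBytes = encoder.encode(name)
		// header: u8 name length, name bytes, u32 LE data length
		const header = new Uint8Array(1 + nameBytes.length + 4)
		header[0] = nameBytes.length
		header.set(nameBytes, 1)
		new DataView(header.buffer).setUint32(1 + nameBytes.length, data.length, true)
		chunks.push(header, data)
	}
	// concatenate all chunks into one buffer
	const total = chunks.reduce((sum, chunk) => sum + chunk.length, 0)
	const out = new Uint8Array(total)
	let offset = 0
	for (const chunk of chunks) {
		out.set(chunk, offset)
		offset += chunk.length
	}
	return out
}

function* unpackToq(pack: Uint8Array): Generator<[string, Uint8Array]> {
	const view = new DataView(pack.buffer, pack.byteOffset, pack.byteLength)
	let offset = 4 // skip the magic bytes
	while (offset < pack.length) {
		const nameLength = pack[offset++]
		const name = decoder.decode(pack.subarray(offset, offset + nameLength))
		offset += nameLength
		const dataLength = view.getUint32(offset, true)
		offset += 4
		yield [name, pack.subarray(offset, offset + dataLength)]
		offset += dataLength
	}
}
```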
reward us with github stars
build with us at https://e280.org/ but only if you're cool