Personal Projects
Things I built on my own time, mostly to scratch an itch from work.
Salesforce Data Loader is fine for a one-off import. For repeated migration work, where you're constantly tweaking mappings and re-running the same job, it gets tedious fast. sf-bulker instead reads everything from a JSON config file, so a job is reproducible with one command.
- Config in config.json, versioned alongside your project
- No Java, just Node.js 18+
- OAuth 2.0 client credentials, no browser needed
- rowTransform functions let you clean or remap data before upload
- Outputs timestamped success/failure CSVs
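A rowTransform is just a function that receives one CSV row and returns the cleaned-up version. A minimal sketch, assuming hypothetical column and field names (Email, phone_number, Phone are illustrative, not part of sf-bulker's actual schema):

```javascript
// Hypothetical rowTransform: cleans one CSV row before upload.
// The input keys and output field names here are assumptions.
function rowTransform(row) {
  return {
    ...row,
    // Normalize email casing and strip stray whitespace
    Email: row.Email ? row.Email.trim().toLowerCase() : null,
    // Copy a legacy column into the target Salesforce field name
    Phone: row.phone_number,
  };
}
```

The same function can drop rows, rename keys, or derive values; whatever it returns is what gets sent in the bulk job.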
This one connects to a Salesforce org using OAuth client credentials and subscribes to any Platform Event. Each payload gets saved to a timestamped file; setup is just a .env file.
- Subscribes from the earliest available event
- Event path configurable via env var
- Output is pretty-printed JSON, one file per minute
Give it a CSV, get back Salesforce metadata ready to deploy. Handles CustomObject field definitions and CustomMetadata records.
- npm run fields generates a CustomObject XML with field definitions
- npm run cm generates one CustomMetadata file per CSV row
- Supports Text, Currency, Number, Date, Picklist, MultiselectPicklist
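Conceptually, each CSV row becomes one `<fields>` element in the CustomObject XML. A hedged sketch, assuming made-up column names (label, apiName, type, length are illustrative, not the tool's actual CSV schema):

```javascript
// Turn one CSV row into a CustomField definition fragment.
// Only Text carries a <length> element in this simplified sketch.
function fieldXml({ label, apiName, type, length }) {
  const lines = [
    '    <fields>',
    `        <fullName>${apiName}</fullName>`,
    `        <label>${label}</label>`,
    `        <type>${type}</type>`,
  ];
  if (type === 'Text' && length) lines.push(`        <length>${length}</length>`);
  lines.push('    </fields>');
  return lines.join('\n');
}
```

The real generator wraps these fragments in the CustomObject envelope with the metadata namespace; CustomMetadata records follow the same idea but emit one file per row instead of one element.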
Open a .object-meta.xml file, run SPRR: Create report from the Command Palette, and get an interactive table showing all RecordTypes and their enabled picklist values.
- Dropdown to select any picklist field by API name
- One column per RecordType
- Adapts to VS Code light and dark mode
- No files written to disk
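The data the report needs is already in the .object-meta.xml file: each `<recordTypes>` block lists its enabled `<picklistValues>` per field. A rough sketch of the extraction (a real implementation would use a proper XML parser; a regex pass is enough to show the shape of the data):

```javascript
// For each <recordTypes> block, collect the enabled picklist values
// keyed by record type name, then by picklist field API name.
function picklistsByRecordType(xml) {
  const result = {};
  for (const [, body] of xml.matchAll(/<recordTypes>([\s\S]*?)<\/recordTypes>/g)) {
    // In Salesforce object metadata the record type's own <fullName>
    // appears before its <picklistValues> entries.
    const rtName = body.match(/<fullName>([^<]*)<\/fullName>/)[1];
    result[rtName] = {};
    for (const [, pv] of body.matchAll(/<picklistValues>([\s\S]*?)<\/picklistValues>/g)) {
      const field = pv.match(/<picklist>([^<]*)<\/picklist>/)[1];
      result[rtName][field] = [...pv.matchAll(/<fullName>([^<]*)<\/fullName>/g)].map(m => m[1]);
    }
  }
  return result;
}
```

The webview then renders this object as the table: one column per record type, one dropdown entry per field key.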
The name is short for "translation hate". Clients work in Excel; Salesforce works with .stf files. thate converts in both directions.
- thate excel converts .stf to .xlsx
- thate stf converts .xlsx back to .stf
- --omit flag strips already-translated values from the Excel output
- Config file to set paths and filter unwanted entries
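What --omit does, conceptually: drop the rows whose translation column is already filled, so the Excel file only shows the work remaining. A simplified sketch assuming tab-separated STF data rows (key, source label, translation); the real .stf format also carries header sections that this ignores:

```javascript
// Keep only the rows that still need translating, i.e. whose third
// (translation) column is empty or whitespace.
function untranslatedRows(stfLines) {
  return stfLines
    .map(line => line.split('\t'))
    .filter(cols => cols.length >= 2 && !(cols[2] && cols[2].trim()));
}
```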