3 hours ago
AI is only as useful as the amount of high quality inputs you train it on
most inputs are garbage
the good stuff gets lost and usually those training the AI are not experts in the material they are feeding it, and can't curate it well enough
people using AI to do things are only ever going to get mediocre results for everything, and the only way the quality will improve is if more people produce quality stuff, but that doesn't seem to be on the agenda
when AIs are being trained on stuff that mostly AIs generated it's gonna be a circus
4 hours ago
having fun at the fiatmine today
i sorta had an inkling something was missing from the workflow of my colleagues building the smart contracts, the middleware data cache index server, and the front end, and now i'm literally having to fill that gap
they don't have a mock data generator scheme for their index to enable testing the stuff in between the front end and the middleware index database cache
i had already written mock stuff, because i can't test any kind of credible statistical analysis algorithm without a large quantity of data, both to exercise the algorithm and to evaluate whether it meets its requirements
it's dead boring work, making mocks, and there's a bit of a twinge of a perception that i'm not working fast enough, and i think the thing is they literally don't realise i have no data to build my big data analysis code against
i've dropped this into the dev chat: specifically, that i have to build a mock simulating a live system to make sure my sync to the database is adequate
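a mock generator along these lines can be sketched in Go - the record shape and names here are hypothetical, not the actual middleware schema, but the point is the same: seeded, reproducible bulk data so the stats code has something credible to chew on

```go
package main

import (
	"fmt"
	"math/rand"
	"time"
)

// Record is a hypothetical shape for an index entry; the real
// middleware schema isn't shown in the post.
type Record struct {
	ID        int
	Timestamp time.Time
	Value     float64
}

// GenMockRecords produces n deterministic pseudo-random records.
// A fixed seed makes test runs reproducible, which matters when
// validating a statistical analysis against known inputs.
func GenMockRecords(n int, seed int64) []Record {
	rng := rand.New(rand.NewSource(seed))
	start := time.Date(2024, 1, 1, 0, 0, 0, 0, time.UTC)
	out := make([]Record, n)
	for i := range out {
		out[i] = Record{
			ID:        i,
			Timestamp: start.Add(time.Duration(i) * time.Second),
			Value:     rng.NormFloat64(), // gaussian values to exercise the stats code
		}
	}
	return out
}

func main() {
	for _, r := range GenMockRecords(3, 42) {
		fmt.Printf("%d %s %.3f\n", r.ID, r.Timestamp.Format(time.RFC3339), r.Value)
	}
}
```

the deterministic seed is the design choice that matters: two runs with the same seed produce identical data, so a regression in the analysis shows up as a diff, not a maybe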
so, i think that's gonna be a point i'm gonna nudge at any time i have incidents like what the CTO did a few weeks back, complaining about how i am not working fast enough... i'll just say "there was no testing framework to mock a running system and i have to build that as well"
it's boring tho, lol, man this is boring. it's so much more interesting building things that barely have to be online: i refer to one in my profile and bam, 50 users are pushing data to my relay, meaning i can test it in the wild and avoid a lot of this tedium
but as @npub1l5s...gx9z would say, we still need to have some kind of a decent test framework anyhow
well i guess i'm getting some time to develop my skills in building mocks, omg it's so boring tho lol man
14 hours ago
#realy #devstr #progressreport
after some fighting to get the huma api to play nice with my easy-registering router - mainly just realising that the huma servemux should match on path prefixes, not the whole path - i finally had the HTTP API presenting
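the prefix-vs-whole-path fix can be illustrated with a tiny longest-prefix dispatcher - names and structure are illustrative only, not the actual huma/realy integration

```go
package main

import (
	"fmt"
	"strings"
)

// route maps a path prefix to a handler name; a real router would
// hold http.Handler values instead of strings.
type route struct {
	prefix  string
	handler string
}

// match returns the handler for the longest matching prefix, so
// "/configuration/set" is claimed by "/configuration" rather than
// falling through to "/". Matching whole paths instead would make
// every sub-path a 404, which is the bug described above.
func match(routes []route, path string) (string, bool) {
	best, found := "", false
	bestLen := 0
	for _, r := range routes {
		if strings.HasPrefix(path, r.prefix) && len(r.prefix) >= bestLen {
			best, bestLen, found = r.handler, len(r.prefix), true
		}
	}
	return best, found
}

func main() {
	routes := []route{
		{"/", "root"},
		{"/configuration", "config"},
	}
	h, _ := match(routes, "/configuration/set")
	fmt.Println(h) // config
}
```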
first things first, i wanted to move more stuff into dynamic configuration stored in the database, starting with the admins, in order to restrict access to the admin parts of the API
i haven't fully finished it yet but it starts up wide open now, and you have to use the /configuration/set endpoint and drop at least one npub into the Admins field, and voila, it is now locked down
first, though, i have to add one configuration to the environment: the initial password, which you put in the Authorization header field to get access. this ensures the relay is not open to the world should it be deployed on some far distant VPS that can be spidered by nasty people who might quickly figure out how to break it on you
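the bootstrap logic described above - locked to a password from the environment until at least one admin npub lands in the configuration - might look roughly like this; all names are hypothetical, not realy's actual API

```go
package main

import "fmt"

// Config mirrors the idea in the post: an Admins list in dynamic
// configuration plus an initial password from the environment,
// presented via the Authorization header. Field names are
// illustrative, not realy's real types.
type Config struct {
	Admins          []string // npubs allowed on admin endpoints
	InitialPassword string   // bootstrap secret from the environment
}

// authorized decides whether a request may touch the admin API.
// The password works even before any admins exist, so a fresh
// deployment is never open to spiders; known admin npubs work
// once the Admins field has been populated.
func (c *Config) authorized(authHeader, npub string) bool {
	if c.InitialPassword != "" && authHeader == c.InitialPassword {
		return true
	}
	for _, a := range c.Admins {
		if a == npub {
			return true
		}
	}
	return false
}

func main() {
	cfg := &Config{InitialPassword: "s3cret"}
	fmt.Println(cfg.authorized("s3cret", ""))       // true: bootstrap password
	fmt.Println(cfg.authorized("", "npub1example")) // false: not yet an admin
	cfg.Admins = append(cfg.Admins, "npub1example")
	fmt.Println(cfg.authorized("", "npub1example")) // true: now an admin
}
```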
once those two pieces are in place, i need to put back the nip-98 expiration variant generator tool, and then you can use its token temporarily to auth as an administrator and tinker with the other admin functions, though the configuration is the most important priority
so, a nice afternoon's work, dragging a bit into the evening, but i got my nice router library working with the huma API, and based on the code from the original realy i should reinstate its full functionality pretty quickly... along the way i'll probably find something to make a bit better, but overall it's fine as it is. it's a bit clunky using the export function in the scalar REST docs UI, but with my nip-98 capable curl tool, nurl, you can just use that and basta
now is time for bed tho
#gn