Explainable AI is all the hype! Or maybe not so much. But depending on the use case, the AI may have to be explainable.
AI is hot stuff, and many tools are marketed as AI tools. But what is an AI tool, and how does it differ from a non-AI tool?
The data engineer understands the architecture of the production system. She rewrites the notebook into an architecturally coherent unit that is packaged as an image. The image is then deployed to the production system.
Machine-learned models can be really big, like the multi-billion-parameter GPT models; there is a chance they contain sensitive data, their output may need to be sanitized, or something else entirely. This post is on the fundamentals of a scalable ML architecture.
We need good data platforms to ingest, process, and analyze data.
The Case: As a financial institution we want to avoid fraud. Fraud is a broad term, so in this case we focus specifically on identity theft. That is, we want to be able to quickly detect oddities signaling that someone's identity is being used illegally.
Let an adversary add realistic data to your analysis to see if it is resilient.
One reason not to use external libraries is the learning outcome of the task.
One should be skeptical when presented with statistics. One way to articulate well-founded skepticism is by following three steps of reasoning.
An MVP engineer helps scope a product, build it to verify its business potential, set up a team, and lastly lay out a roadmap for further product development.
Object-oriented programming is inherently stateful. This fits some applications well, but can cause confusion and reduce reliability in others.
Knowledge-graph-based research gives an overview of vague ideas, assists knowledge discovery, and is a strong collaboration tool.
Developing solutions for problems is hard. Make it a little easier by choosing the assumptions that allow you to do so.
We often don't need that much organization. Organizational structures should be afterthoughts.
Second brains have reached quite the hype, with tools like RoamResearch, Obsidian, and friends. Needless to say, it is a good tool. Since I started building my second brain, I have been journaling more often than not.
Scrapers are integral to data-intensive applications. They are real development projects in their own right, and, as such, there are key architectural decisions to make.
A question is to a data scientist what a user is to an application developer.
Today I implemented Fibonacci in the Haskell type system. That means that I can get the Haskell compiler to generate a type for the n'th Fibonacci number.
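A minimal sketch of how such a type-level computation might look, assuming GHC's DataKinds and TypeFamilies extensions and the type-level naturals from GHC.TypeLits (the post's actual encoding may differ):

```haskell
{-# LANGUAGE DataKinds, TypeFamilies, TypeOperators, UndecidableInstances #-}

import GHC.TypeLits
import Data.Proxy (Proxy (..))

-- Closed type family: the compiler reduces Fib n to a type-level natural.
type family Fib (n :: Nat) :: Nat where
  Fib 0 = 0
  Fib 1 = 1
  Fib n = Fib (n - 1) + Fib (n - 2)

-- Reflect the type-level result back to a runtime value, just to show it off.
main :: IO ()
main = print (natVal (Proxy :: Proxy (Fib 10)))  -- prints 55
```

The point is that all the arithmetic happens during type checking; the program only prints a number the compiler has already computed.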
Decomposing the testing process quickly shows that it consists of two components: first a specification is formulated, and then it is verified.
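To make the decomposition concrete, here is a minimal, hypothetical sketch: the specification is stated as a pure predicate, and verification is simply checking that predicate against concrete inputs (the function under test, slug, is invented for illustration):

```haskell
import Data.Char (isAlphaNum, toLower)

-- Hypothetical function under test: turn a title into a lowercase slug.
slug :: String -> String
slug = map toLower . filter isAlphaNum

-- Specification: a property the output must satisfy, stated independently
-- of how slug is implemented.
spec :: String -> Bool
spec input = all (\c -> c `elem` ['a' .. 'z'] ++ ['0' .. '9']) (slug input)

-- Verification: checking the specification against concrete examples.
main :: IO ()
main = print (all spec ["Hello, World!", "Data 2023", ""])
```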