We built a job scheduler and reporting tool for ourselves, and it turned out to be too good to keep internal. So here it is. It's named rpeat because jobs repeat, but it also repeats important details about your jobs to you, wherever you are. It's dead simple but feature-rich, intuitive but powerful.
What is a JWT and why would I want to use one?
This talk will cover common use cases.
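As a taste of what's inside a token, here is a minimal, standard-library-only sketch (the helper name and the example claims are my own illustration) that splits a JWT into its three dot-separated parts and decodes the header and payload. Note that it deliberately skips signature verification — real code should verify with a library such as PyJWT before trusting any claim.

```python
import base64
import json

def decode_jwt_unverified(token: str) -> dict:
    """Decode a JWT's header and payload WITHOUT verifying the signature."""
    header_b64, payload_b64, _signature = token.split(".")

    def b64url_decode(segment: str) -> bytes:
        # JWTs use URL-safe base64 with the padding stripped; restore it.
        segment += "=" * (-len(segment) % 4)
        return base64.urlsafe_b64decode(segment)

    return {
        "header": json.loads(b64url_decode(header_b64)),
        "payload": json.loads(b64url_decode(payload_b64)),
    }

# A hand-built example token (fake signature, illustrative claims only).
header = base64.urlsafe_b64encode(b'{"alg":"HS256","typ":"JWT"}').rstrip(b"=").decode()
payload = base64.urlsafe_b64encode(b'{"sub":"alice","admin":true}').rstrip(b"=").decode()
token = f"{header}.{payload}.fake-signature"

decoded = decode_jwt_unverified(token)
print(decoded["payload"]["sub"])  # -> alice
```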
This talk is for those who want to pierce the veil of abstraction and learn how their Python code is actually executed on a computer. We will start with a guided overview of the Python runtime environment in the CPython interpreter. Next will be an overview of the built-in inspect module and how it allows direct access to the Python runtime from your own Python code. Finally, I will show how you can leverage this knowledge in pdb.
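To give a flavor of the kind of runtime access the talk covers, here is a small sketch (the function names are my own illustration) that examines the live call stack and a function's signature with the inspect module:

```python
import inspect

def outer():
    return inner()

def inner():
    # inspect.stack() returns the live call stack, innermost frame first,
    # so the first two entries here are inner() and its caller outer().
    return [frame.function for frame in inspect.stack()[:2]]

print(outer())  # -> ['inner', 'outer']

# inspect can also describe callables without running them:
print(inspect.signature(outer))  # -> ()
```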
A deep dive into what actually happens when you interface with GPIO pins at the hardware and register level in MicroPython.
How does the Ellipsis (...) work? Let's find out.
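As a quick preview of the topic (the class name below is my own illustration): Ellipsis is an ordinary singleton object, written ... or Ellipsis, that shows up both as a placeholder and in extended slicing, where CPython passes it to __getitem__ like any other index:

```python
# Ellipsis is a real singleton object with two spellings.
print(... is Ellipsis)  # -> True

# Common use: a placeholder for a body you haven't written yet.
def not_written_yet():
    ...

# In extended slicing, ... is handed to __getitem__ unchanged.
class ShowIndex:
    def __getitem__(self, index):
        return index

print(ShowIndex()[..., 0])  # -> (Ellipsis, 0)
```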
Eve and Ray embarked on a two-week experiment they're calling a Learning Sprint: four hours a day, five days a week, over two weeks, they set goals and executed on them. What did they learn? Did it work? What fun facts did they pick up along the way? They'll explain in this thrilling talk for all skill levels.
AttributeError: property has no setter. But... I thought this was Python!
Let's have a talk about descriptors.
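As a teaser, here is a minimal sketch of the error in question and its fix (the class names are my own illustration). property is itself a descriptor: it supplies __get__ for free, but assignment raises AttributeError until you give it a setter.

```python
class Circle:
    def __init__(self, radius):
        self._radius = radius

    @property
    def radius(self):          # property provides __get__ for us...
        return self._radius

c = Circle(2)
print(c.radius)  # -> 2

try:
    c.radius = 5               # ...but with no setter, assignment fails
except AttributeError as exc:
    print(exc)                 # "property ... has no setter"

class MutableCircle(Circle):
    @Circle.radius.setter      # builds a new property with the same getter
    def radius(self, value):
        self._radius = value

m = MutableCircle(2)
m.radius = 5
print(m.radius)  # -> 5
```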
I will be presenting some of my current work using Python and Django to develop an open source program for professional students. This program enables students to easily peer-review and create a functioning deck of unique Anki flash cards without any prior programming experience. Anki is currently the most sought-after open source flash card creation system among professional students. However, due to the learning curve (and time) required to create a quality deck, many students still do not take advantage of this resource. This program seeks to change that. Come see and hear how!
Have you wondered how gestures are loaded into a computer?
Or are you questioning whether transcribing our movements is even possible?
Well, it is possible... But not without its own unique set of problems.
This talk touches on everything in the transcription process from the client side to the server side.
So how did we do it?
We used MediaPipe's machine-learning library to transcribe our movements and gestures into XYZ coordinates. We then created a web crawler to collect all of the data from the American Sign Language (ASL) website. We are in the process of normalizing this data based on the angles the landmarks form relative to one another.
The next step of the process, transcribing live gestures and movements, is still in development, so let's talk about it!
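The angle-based normalization step can be sketched like this (the helper name and landmark values are hypothetical; in practice the (x, y, z) points would come from MediaPipe's tracking output). Angles between landmarks are invariant to where the hand sits in the frame and how large it appears, which is what makes them useful for normalization:

```python
import math

def angle_between(a, b, c):
    """Angle at joint b (in degrees) formed by 3-D landmarks a-b-c.

    Landmarks are (x, y, z) tuples. The result depends only on the
    shape of the gesture, not its position or scale in the frame.
    """
    v1 = tuple(ai - bi for ai, bi in zip(a, b))
    v2 = tuple(ci - bi for ci, bi in zip(c, b))
    dot = sum(x * y for x, y in zip(v1, v2))
    mag = math.hypot(*v1) * math.hypot(*v2)
    return math.degrees(math.acos(dot / mag))

# A right angle at the middle landmark:
print(angle_between((1, 0, 0), (0, 0, 0), (0, 1, 0)))  # -> 90.0
```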
This will be a tutorial-based talk on how to use conversational language models to make an interactive chatbot. The models we will use are BlenderBot from Facebook Research and DistilBERT from Google Research / Hugging Face. We will also use Transformers from Hugging Face, an easy-to-use package and API for the above models, and go over using speech recognition and text-to-speech to make interaction more fluid. At the end there will be an interactive demo.