I Love Lucy and her Low IQ 🐒
DeFi payment-after-delivery service with QR-code asset tracking and an in-app barcode scanner, built on Polkadot's Substrate.
Make it Reign: buy now, pay later. The Reign wallet combines a user-friendly crypto wallet with shopping rewards, payment-after-delivery options, and analytics for operations and tracking, plus security measures such as recovery-phrase verification. Customers sign up, easily apply for credit, and receive a $250 limit to check out with the Reign asset; inside the integrated wallet they choose a repayment plan. The value proposition is a set of tiers, each with its own spending limit; the default tier spreads payments over three months. The tiers reward customers for their loyalty to shopping online through Reign.
With users' signed consent, Reign collects consumer-pattern data to target ads: consumers get better, personalized recommendations and merchants reach more customers. Revenue comes from three sources: a monthly account fee ($8/month) charged until the interest-free balance is repaid in full, administration fees, and advertising revenue from the shopping data collected through the mobile app, with ads targeted by geography, gender, age, and spending habits.
Reign is built in Rust on Polkadot's Substrate framework, where application blockchains are called parachains. Parachains process transactions from all chains in the network in parallel, enabling cross-chain exchange of tokens and account balances and optimizing for scale even with large amounts of block metadata. Substrate-based blockchain nodes expose capabilities such as networking, consensus, and an RPC server for node interactions. Against Visa benchmarks, the parachain design is 61% faster, and upgrades are forkless and future-proof.
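The credit mechanics described above can be sketched in a few lines. This is a minimal illustration, not Reign's actual code: the $250 limit, three-month default tier, and $8/month interest-free fee come from the description, while the tier names and the second tier are hypothetical placeholders.

```python
from dataclasses import dataclass

@dataclass
class Tier:
    name: str
    months: int            # repayment period
    spending_limit: float  # credit limit in USD

# "default" matches the described $250 / 3-month tier; "plus" is hypothetical.
TIERS = [
    Tier("default", months=3, spending_limit=250.0),
    Tier("plus", months=6, spending_limit=500.0),
]

ACCOUNT_FEE = 8.0  # $8/month until the balance is repaid (interest-free)

def repayment_schedule(purchase: float, tier: Tier) -> list:
    """Split a purchase into equal interest-free installments,
    each carrying the flat monthly account fee."""
    if purchase > tier.spending_limit:
        raise ValueError("purchase exceeds tier spending limit")
    installment = purchase / tier.months
    return [round(installment + ACCOUNT_FEE, 2) for _ in range(tier.months)]

print(repayment_schedule(240.0, TIERS[0]))  # three payments of 88.00
```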
The Large Hadron Collider is the world's largest and highest-energy particle collider; its ATLAS detector generates on the order of 1 petabyte of raw data per second as bunches of protons collide 40 million times every second. A collision rate this high means not all events can be stored, so a particle-physics trigger system selects specific events. The goal of this project is to reduce the data volume by engineering a compression algorithm for the trigger system.
Autoencoder neural networks were used for data compression and anomaly detection. The two-part encoder-decoder system compresses hadron-jet event data from 4 variables to 3. It was trained over a dataset to encode the inputs into a smaller memory space using PyTorch, the FastAI library, the ROOT data-analysis framework, and CERN ATLAS Docker images. The analysis includes plots and graphs explaining invariant mass, purity selection, trigger efficiency, and hadron-event reconstruction.
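The 4-to-3 compression idea can be sketched with a tiny linear autoencoder trained by plain gradient descent. This is a NumPy toy under stated assumptions, not the project's PyTorch/FastAI model, and the data below is a synthetic stand-in for 4-variable jet events:

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for 4-variable jet-event data; one variable is made
# redundant so a 3-dimensional latent code can reconstruct the input well.
X = rng.normal(size=(512, 4))
X[:, 3] = 0.5 * X[:, 0] + 0.5 * X[:, 1]

W1 = rng.normal(scale=0.1, size=(4, 3))  # encoder: 4 -> 3
W2 = rng.normal(scale=0.1, size=(3, 4))  # decoder: 3 -> 4
lr = 0.01

def loss(X, W1, W2):
    R = X @ W1 @ W2 - X
    return float(np.mean(R ** 2))

initial = loss(X, W1, W2)
for _ in range(2000):
    Z = X @ W1       # latent code (the compressed representation)
    R = Z @ W2 - X   # reconstruction error
    # gradient steps (constant factors folded into the learning rate)
    gW2 = 2 * Z.T @ R / len(X)
    gW1 = 2 * X.T @ (R @ W2.T) / len(X)
    W1 -= lr * gW1
    W2 -= lr * gW2

final = loss(X, W1, W2)
print(initial, final)  # reconstruction error drops as training proceeds
```

A nonlinear encoder/decoder (as in the real project) follows the same pattern with activation functions between layers.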
Issues with funding distribution in science include billions of dollars wasted on redundant research due to replication of data. Scientists collect large amounts of valuable data every day, but that data is not shared. SCI4ALL is a decentralized decision-making platform for scientists and researchers to collaborate and be paid fairly for their work. Scientists submit grant proposals and stakeholders vote for their preferred choice. Each research project is associated with an ETH address, with incentive tokens for data sharing. The blockchain-based system improves crediting and funding allocation among researchers.
Developed with Truffle, Solidity, Node, Web3.js, and Ganache at ETHWaterloo. SCI4ALL verifies the authenticity of data with immutable "time stamps," enabling collaborative science on the blockchain with a tamper-proof history of research contributions recorded as transactions. The Ethereum smart-contract environment protects intellectual-property rights while facilitating the exchange of valuable research results, peer review, and the publication of papers.
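The time-stamping idea can be shown off-chain in a few lines: a dataset is fingerprinted by its hash, and the (hash, author, timestamp) record is what would be written immutably to the contract. This is an illustrative sketch, not the SCI4ALL contract API, and the address shown is a placeholder:

```python
import hashlib
import time

def fingerprint(data: bytes) -> str:
    """SHA-256 digest used as the dataset's fingerprint."""
    return hashlib.sha256(data).hexdigest()

def make_record(data: bytes, author_eth_address: str, ts=None) -> dict:
    """The record that would be stored on-chain (illustrative fields)."""
    return {
        "hash": fingerprint(data),
        "author": author_eth_address,
        "timestamp": ts if ts is not None else time.time(),
    }

def verify(data: bytes, record: dict) -> bool:
    # Anyone holding the raw data can recompute the hash and check it
    # against the immutable on-chain record.
    return fingerprint(data) == record["hash"]

rec = make_record(b"raw measurements v1", "0xPLACEHOLDER", ts=1_700_000_000)
print(verify(b"raw measurements v1", rec))  # True
print(verify(b"tampered data", rec))        # False
```

On-chain, the same check is what makes the contribution history tamper-proof: changing the data changes the hash, which no longer matches the recorded transaction.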
How can the large amounts of medical data needed for a quantum-computing model be collected and used to optimize cancer research? Personalized medicine seeks to address this by considering individual variation: differences in genome, proteome, and exposome (which takes lifestyle, health history, and similar factors into account). Traditional machine learning has helped precision medicine progress but is reaching its limits as the number of health-related variables grows. Quantum computing can aid this cause and emphasize the interplay of health risk factors in disease progression. Developed a Quantum Support Vector Machine (QSVM) for anomaly detection that translates classical medical data into quantum states. The kernel-based SVM is expressed as a quadratic-programming (QP) problem whose energy function is minimized as a Quadratic Unconstrained Binary Optimization (QUBO), with the discrete binary solution found on a D-Wave 2000Q quantum annealer. The quantum optimization algorithm solves the problem in logarithmic rather than polynomial order, becoming more efficient as the number of qubits increases, at lower computational cost.
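The SVM-to-QUBO step can be sketched in its simplest form: restrict each Lagrange multiplier to a single binary variable a_i in {0, 1} (real encodings use several bits per multiplier), so the kernel-SVM dual becomes a QUBO dictionary that a D-Wave sampler could take directly. The data, kernel width, and single-bit encoding below are illustrative assumptions, not the project's configuration; here we brute-force the ground state instead of annealing:

```python
import math

# Toy stand-in for two-class "medical" feature vectors.
X = [(0.0, 0.0), (0.1, 0.2), (1.0, 1.0), (1.2, 0.9)]
y = [-1, -1, 1, 1]

def rbf(a, b, gamma=1.0):
    return math.exp(-gamma * sum((u - v) ** 2 for u, v in zip(a, b)))

def svm_qubo(X, y):
    """QUBO for: minimize 0.5*sum_ij a_i a_j y_i y_j K_ij - sum_i a_i,
    with a_i in {0, 1} (one-bit encoding of each multiplier)."""
    n = len(X)
    Q = {}
    for i in range(n):
        for j in range(i, n):
            k = y[i] * y[j] * rbf(X[i], X[j])
            if i == j:
                Q[(i, i)] = 0.5 * k - 1.0  # quadratic term plus linear reward
            else:
                Q[(i, j)] = k              # each off-diagonal pair counted once
    return Q

def energy(Q, a):
    return sum(v * a[i] * a[j] for (i, j), v in Q.items())

Q = svm_qubo(X, y)
# On D-Wave this dict would go to a sampler; 4 variables we can brute-force.
best = min(range(16), key=lambda m: energy(Q, [(m >> i) & 1 for i in range(4)]))
print([(best >> i) & 1 for i in range(4)])  # -> [0, 1, 1, 0]
```

The ground state selects one point from each class as active multipliers, which is the QUBO analogue of picking support vectors.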
Covid Control is a machine-learning optimization model that predicts the 7-day moving average of COVID-19 cases using sequential decision making. The first model, the predictor, is an LSTM trained on the Oxford dataset. Eight non-pharmaceutical interventions (NPIs) were used as training parameters: school closing, workplace closing, cancellation of public events, restrictions on gatherings, closure of public transport, stay-at-home requirements, restrictions on internal movement between regions/cities, and international travel controls. The previous 21 days of values for the 8 NPIs were fed into the action input of the prescriptor. The second model, the prescriptor, uses a reinforcement-learning agent that outputs a set of actions to minimize the number of cases. The weights for each country were drawn from a uniform distribution on [0, 1] and normalized to sum to one, and the predictions were averaged over a 180-day period to obtain the final objective number of COVID cases for each country. This allowed the reinforcement-learning agent to predict future states of its environment by estimating the system state. In the reward-function search, the predictor's output served as the reward vector for the Q-value steps. This helps plan intervention measures such as lockdowns, social distancing, or the mandatory use of face masks.
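The evaluation objective described above (uniform per-country weights normalized to sum to one, predictions averaged over 180 days) can be sketched as follows. The predictor here is a stub standing in for the LSTM, and all names are illustrative:

```python
import random

random.seed(0)

# The 8 NPI signals fed to the predictor, per the description.
NPIS = ["school_closing", "workplace_closing", "cancel_public_events",
        "restrictions_on_gatherings", "close_public_transport",
        "stay_at_home", "internal_movement", "international_travel"]

def stub_predictor(country, day):
    """Placeholder for the LSTM's 7-day moving-average case prediction,
    which in the real system consumes 21 days of the 8 NPI values."""
    return 100.0 + 10.0 * random.random()

def objective(countries, horizon=180):
    # Weights drawn uniformly from [0, 1], normalized to sum to one.
    weights = [random.random() for _ in countries]
    total = sum(weights)
    weights = [w / total for w in weights]
    score = 0.0
    for w, c in zip(weights, countries):
        # Average the predicted cases over the 180-day horizon.
        avg = sum(stub_predictor(c, d) for d in range(horizon)) / horizon
        score += w * avg
    return score

print(objective(["CountryA", "CountryB", "CountryC"]))
```

The prescriptor's reinforcement-learning agent would be scored against exactly this kind of weighted, horizon-averaged case count.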
Published developer with multiple Amazon Alexa applications on the Alexa Skills Store. The apps integrate a Voice User Interface (VUI) and In-Skill Purchasing (ISP); ISP allows developers to enrich in-skill experiences and drive deeper customer engagement.
Data Analysis with Python and Jupyter Notebook on Udacity student-engagement data, with CSV data cleansing on enrollments.csv, daily_engagement.csv, and project_submissions.csv. Examined correlations between Udacity project completion rates and the following factors: minutes engaged, lessons completed, and days visiting the classroom.
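The correlation step can be sketched with a plain Pearson correlation. The toy numbers below are invented stand-ins for values joined from daily_engagement.csv and project_submissions.csv, not actual Udacity data:

```python
import math

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Invented per-student examples: engagement vs. projects passed.
minutes_engaged = [30, 120, 200, 45, 300]
projects_passed = [0, 1, 2, 0, 3]
print(round(pearson(minutes_engaged, projects_passed), 3))
```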
Developed a Node.js application using the Word JavaScript API for Hack Princeton. Users record audio in the browser, and the data is automatically parsed into Microsoft Word via the JS add-in.
Data Analysis with Python and Jupyter Notebook on New York subway and weather data. Mapped NYC subway ridership and found high-ridership clusters between longitudes -73.95 and -74.03 and latitudes 40.70 and 40.79.
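The cluster filter amounts to a bounding-box test on the coordinates above. The station names and entry counts below are made-up examples; only the coordinate window comes from the analysis:

```python
# High-ridership window from the analysis:
# longitude -74.03 to -73.95, latitude 40.70 to 40.79.
stations = [
    {"name": "A", "lon": -73.99, "lat": 40.75, "entries": 12000},
    {"name": "B", "lon": -73.80, "lat": 40.66, "entries": 3000},
    {"name": "C", "lon": -74.01, "lat": 40.71, "entries": 9000},
]

def in_cluster(s):
    """True if the station falls inside the high-ridership bounding box."""
    return -74.03 <= s["lon"] <= -73.95 and 40.70 <= s["lat"] <= 40.79

high = [s["name"] for s in stations if in_cluster(s)]
print(high)  # -> ['A', 'C']
```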
Developed a neural-network aimbot for FPS games with a custom training mode, written in C++, providing a fast and efficient framework with scripting support. Includes customizable predictions and dynamic speed settings. It recognizes game objects within a certain range, then aims at them using game physics by hooking into the FPS game engine to read game data and auto-aim without altering game files.
Data Analysis with Python and Jupyter Notebook on Gapminder data covering employment rates (%), life expectancy (years), GDP per capita (inflation-adjusted US$), primary-school completion for boys (%), and primary-school completion for girls (%).