About Me
~

Hi! I’m David Laskowski. I recently graduated with a Bachelor of Science in Computer Science from the New Jersey Institute of Technology (Spring 2025). I have a strong passion for problem-solving and building creative solutions, which I aim to apply in a career in Software Engineering or Cybersecurity. Some of my hobbies include snowboarding, fitness, pickleball, and bowling.

Skills
~
Languages

Python

Java

C

C++

HTML

CSS

JavaScript

SQL

Kotlin

Swift

TypeScript

Libraries & Frameworks

ReactJS

Bootstrap

Tailwind CSS

NextJS

Developer Tools

GitHub

Git

Jupyter

Linux

MongoDB

Active Directory

Agile

Atlassian

SharePoint

Power Automate

Projects
~

Bowlytics
I developed Bowlytics, an iOS app for bowlers to track their performance, analyze stats, and improve over time. The app supports frame-by-frame scoring with automatic strike and spare calculations, game history, and detailed performance stats. Users can also assign bowling balls and locations to games for deeper insights. Designed with Swift and SwiftUI, Bowlytics features an adaptive UI for all iPhone models and includes full dark mode support. It’s built for both casual players and competitive league bowlers.
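The automatic strike and spare handling follows standard ten-pin scoring rules. As an illustration (the app itself is written in Swift), a minimal Python sketch of that scoring logic might look like:

```python
def score_game(rolls):
    """Score a complete ten-pin game given a flat list of pins knocked down per roll."""
    total, i = 0, 0
    for frame in range(10):
        if rolls[i] == 10:  # strike: 10 plus the next two rolls
            total += 10 + rolls[i + 1] + rolls[i + 2]
            i += 1
        elif rolls[i] + rolls[i + 1] == 10:  # spare: 10 plus the next roll
            total += 10 + rolls[i + 2]
            i += 2
        else:  # open frame: just the pins knocked down
            total += rolls[i] + rolls[i + 1]
            i += 2
    return total

# A perfect game (12 strikes) scores 300.
perfect = score_game([10] * 12)
```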

Peace of Mind
For a group project, we built Peace of Mind, a full-stack therapy web application designed for college students and therapists to track mental health progress and communicate securely. The app features journaling, mental health surveys, real-time chat, therapist search, and appointment scheduling. I worked on both frontend and backend tasks using React, TypeScript, Vite, and Tailwind CSS, along with Node.js, Express, PostgreSQL, and Prisma. We also used Socket.io for real-time messaging and deployed the project using Docker and Railway.

AttendEazy
In CS 485, my group and I built a full-stack project using agile methodologies: AttendEazy, a digital attendance app that lets instructors track their students' attendance. The front end combines React and Vite, styled with Tailwind CSS. The backend uses Node.js and Express with a MySQL database, deployed on AWS and provisioned with Terraform. On the site you can create an account, set up classes with their respective students, and view reports with charts and graphs rendered with the Chart.js library to stay up to date with classroom attendance.

Space Satellite Coordinator
At NJIT's 2023 GirlHacks Hackathon, our team won the "Best Use of Streamlit" award among 120+ participants with a web app that provides real-time satellite information and solar system insights. It allows users to locate nearby satellites and explore detailed information about the planets in our solar system. Built in Python with Streamlit, it combines real-time satellite data from public APIs with AI-driven responses to user queries about space via the OpenAI GPT API.

Yahoo Finance Web Scraper
I developed a real-time data fetching application that scrapes and displays the most active stocks from Yahoo Finance. This project combines Python's powerful scraping capabilities with PHP for web display, integrating technologies like BeautifulSoup, MongoDB, and pymongo for data handling. It features an interactive web interface where users can view and sort stock data, showcasing my skills in both data acquisition and web development. The application stands out for its real-time data scraping and storage, offering a dynamic experience for stock market enthusiasts.
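As a rough sketch of the scraping step, here is how a most-active stocks table can be parsed with BeautifulSoup. The sample markup below is a simplified stand-in, since Yahoo Finance's real page structure differs and changes over time:

```python
from bs4 import BeautifulSoup

def parse_most_active(html):
    """Extract one dict per row from a most-active stocks table.

    The column layout here (symbol, name, price) is illustrative only.
    """
    soup = BeautifulSoup(html, "html.parser")
    stocks = []
    for row in soup.select("table tbody tr"):
        cells = [td.get_text(strip=True) for td in row.find_all("td")]
        if len(cells) >= 3:
            stocks.append({"symbol": cells[0], "name": cells[1], "price": float(cells[2])})
    return stocks

# Simplified sample HTML standing in for a fetched Yahoo Finance page.
sample = """
<table><tbody>
<tr><td>NVDA</td><td>NVIDIA Corp</td><td>131.26</td></tr>
<tr><td>TSLA</td><td>Tesla Inc</td><td>248.50</td></tr>
</tbody></table>
"""
stocks = parse_most_active(sample)
```

In the real project, each parsed row would then be inserted into MongoDB via pymongo for the PHP front end to display.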

Global Temperature Analysis
My group and I completed a data science project analyzing UC Berkeley's recorded global temperature dataset. The aim was to uncover trends and patterns in global temperatures over time using statistical and machine learning techniques, specifically simple linear regression and the Random Forest algorithm. We worked in Jupyter Notebooks, executing cells in sequence, and used Python with the following libraries for modeling and visualization: pandas, NumPy, scikit-learn, and Matplotlib.
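As a hedged illustration of the modeling step, with synthetic data standing in for the Berkeley dataset, fitting both models with scikit-learn looks like this:

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.ensemble import RandomForestRegressor

# Synthetic stand-in for the real dataset: a slow warming trend plus noise.
rng = np.random.default_rng(0)
years = np.arange(1900, 2000).reshape(-1, 1)
temps = 8.0 + 0.01 * (years.ravel() - 1900) + rng.normal(0, 0.1, years.shape[0])

# Simple linear regression recovers the overall trend (roughly 0.01 degrees/year here).
lin = LinearRegression().fit(years, temps)

# Random Forest captures non-linear structure when the trend isn't a straight line.
rf = RandomForestRegressor(n_estimators=100, random_state=0).fit(years, temps)

print(f"estimated trend: {lin.coef_[0]:.4f} degrees/year")
```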

Lexical Analyzer
In my Lexical Analyzer project, coded in C, the first component reads text to identify tokens, utilizing the context provided by adjacent tokens to detect and describe errors. Designed to comprehend an untyped language, it handles strings, integers, real numbers, comments, and basic conditional constructs. The second layer, a recursive descent parser, refines this process by cross-referencing tokens with grammatical structures to pinpoint syntax errors. Finally, the interpreter unifies these functionalities, executing code while verifying syntactic correctness.
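The project itself is written in C; as a compact illustration of the tokenizing idea, here is a simplified regex-based tokenizer in Python for a similar toy language (the token set below is an assumption, not the project's actual grammar):

```python
import re

# Token rules in priority order: real numbers before integers so "3.14" is
# one token, keywords before identifiers, and a catch-all ERROR rule last.
TOKEN_SPEC = [
    ("COMMENT", r"//[^\n]*"),
    ("STRING",  r'"[^"\n]*"'),
    ("REAL",    r"\d+\.\d+"),
    ("INT",     r"\d+"),
    ("KEYWORD", r"\b(?:if|else|while)\b"),
    ("IDENT",   r"[A-Za-z_]\w*"),
    ("OP",      r"[+\-*/=<>!]=?|[(){};]"),
    ("SKIP",    r"\s+"),
    ("ERROR",   r"."),
]
MASTER = re.compile("|".join(f"(?P<{name}>{pat})" for name, pat in TOKEN_SPEC))

def tokenize(source):
    """Yield (kind, text) pairs; unrecognized characters surface as ERROR tokens."""
    for m in MASTER.finditer(source):
        if m.lastgroup != "SKIP":
            yield (m.lastgroup, m.group())

tokens = list(tokenize('if (x == 3.14) { y = "hi"; }'))
```

A recursive descent parser would then consume this token stream, matching each token against the expected grammatical structure and reporting a syntax error when the lookahead token doesn't fit.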