Projects

Current

Save 58

Inspiration: Kurzgesagt – In a Nutshell: This Video Might Save 58 Lives Next Week

Hypothesis: Can an automated message (email, tweet, slack, video) save lives?

Premise: Can presenting the statistically likely causes of death or injury for a specific person's demographic (age, gender, profession), combined with current conditions (date, geography, weather), make that person mindful enough of the particular danger to save their life?

Pros:

  • This is a deceptively hard problem involving multidimensional big data. It could easily keep intern teams occupied for several years.
  • Decomposable. It's possible to break this project into smaller, month-long deliverables that are still challenging.
  • Motivational mission. It feels good to do some good.
  • Potentially marketable as doing good

Cons:

  • Delicate topics for the workplace. See the video.
  • Not profitable without turning people into the product.

Impact Map

Business Value

  • Data science tooling
    • Apache Iceberg
    • developing a model
  • Data/ETL
    • repeatable data scraping
    • automated data evaluation
  • New technologies
    • sveltekit
    • nix/flake
    • tauri
    • multi-dimensional database
    • automated web scraping
    • biome

Next steps

  • Week 0
    1. Evaluate data
    2. Bootstrap project
      • Repo
        • monorepo
        • think about directory structure for data, server, client
      • Tooling: direnv / nix flake <- Daniel or Kyle?
      • Typescript/Sveltekit
        • tooling/one-off scripts can be polyglot
      • Linter/formatter - biome
      • Postgresql docker?
      • CI - github actions
      • README
    3. Management
      • Assign primary mentors
        • pick days in office
      • Laptops - MCD acting
      • github/slack access
    4. Impact map/Stories
      • Review notes
      • Need more challenges
  • Week 1
    • The missing semester
      • Review the first six chapters
      • Command-line Environment and Version Control (Git) will be really helpful
      • Ask lots of questions
    • link 4cs
    • Tool setup
      • Checkout github repo. Work through README
      • Ask fellow mojos! #internship #engineering
    • First commit?
    • Discovery
  • Week 2 - atomic commits / unraveling git commit
    • auth
    • Implement boilerplate design? Jesse has a headless UI library he likes
      • Anonymous user landing page
      • Authenticated user page
      • privacy policy
    • Scraping/Data collection
      • Aggregate data
      • Automate import into database
    • unit test / some web output for a message like the following (see the test sketch after this list):

      Week before daylight saving time: "start shifting your schedule by 15 minutes earlier/later. Heart attacks and car crashes increase after daylight saving time due to sleep deprivation."

    • video parity
      • 3 million people will watch this video
        • 15-35
        • living in western countries
      • driving
        • 8 of you will die in a car crash next week; 416 over the next year. 2m29s
          • 2 (~30%) from speeding
          • 2 (~25%) from drinking and driving
          • 1 from distracted driving
          • 3 from not wearing a seat belt
      • 26 of you will die by falling in the next year. 5m17s
        • scaffolding, ladders, hiking
      • 1 of you will drown next week. 6m07s
        • overestimating swimming abilities
        • going into the water drunk
        • cruise ships
      • 10 of you will die from self-harm next week. 7m12s
        • crisis situations triggered by traumatic events and extraordinary situations
        • resources for help
      • 5 of you will die from cancer next week. 8m52s
        • regular checkups and screenings
        • sunscreen
      • sharing
  • Week 3
    • Ticket work.
  • Week 4
    • Ticket work.
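
A sketch of the Week 2 unit test idea for the daylight saving message, using vitest. The `dstStart` and `isDstWarningWeek` helpers are hypothetical stand-ins for whatever the real rule code ends up exposing, and the US rule (switch on the second Sunday of March) is an assumption.

```ts
import { describe, expect, it } from "vitest";

// Hypothetical helper: US daylight saving time starts on the second Sunday of March.
export function dstStart(year: number): Date {
  const march1 = new Date(year, 2, 1);
  const firstSunday = 1 + ((7 - march1.getDay()) % 7);
  return new Date(year, 2, firstSunday + 7);
}

// Hypothetical rule: warn during the seven days before the switch.
export function isDstWarningWeek(today: Date): boolean {
  const start = dstStart(today.getFullYear());
  const msBefore = start.getTime() - today.getTime();
  return msBefore > 0 && msBefore <= 7 * 24 * 60 * 60 * 1000;
}

describe("daylight saving warning", () => {
  it("fires during the week before the switch", () => {
    // 2024-03-10 was the US switch; 2024-03-05 falls inside the warning week.
    expect(isDstWarningWeek(new Date(2024, 2, 5))).toBe(true);
  });

  it("stays quiet the rest of March", () => {
    expect(isDstWarningWeek(new Date(2024, 2, 20))).toBe(false);
  });
});
```
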
Shallow vs deep

Shallow is a first step to get something working. Deep is a thorough tech investigation.

Shallow
  • JSON / CSV in git
  • Manual algorithm
  • Web app only
  • Manual notification trigger via web
  • Manual data scraping
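
A minimal sketch of the shallow path above, assuming the risk statistics live as a JSON file checked into git and the "manual algorithm" is a hand-written lookup. The file path, entry shape, and message copy are placeholders, not a settled schema.

```ts
import { readFileSync } from "node:fs";

// Placeholder shape for a risk entry checked into git (not a settled schema).
interface RiskEntry {
  ageMin: number;
  ageMax: number;
  cause: string;   // e.g. "car crash"
  message: string; // friendly, non-alarming copy
}

// Manual algorithm: load the JSON from the repo and keep the entries that match.
export function messagesForAge(age: number, path = "data/risks.json"): string[] {
  const entries: RiskEntry[] = JSON.parse(readFileSync(path, "utf8"));
  return entries
    .filter((e) => age >= e.ageMin && age <= e.ageMax)
    .map((e) => e.message);
}

// Manual notification trigger via the web app: render whatever matched.
console.log(messagesForAge(28).join("\n"));
```
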
Deep
  • Apache Iceberg for storage sharing. Data lake. <- Eric Gibb
  • Multi-dimensional data in postgresql / vector storage
  • Desktop/mobile version via Tauri
  • Worker queue for pre-calculating and sending notifications. Use a different queue/worker than previous mojo work, not redis/sidekiq/bullmq (see the sketch after this list).
  • Automated LLM scraping
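
One way the worker-queue item above could avoid redis/sidekiq/bullmq is a plain Postgres jobs table claimed with FOR UPDATE SKIP LOCKED. The sketch below uses the node-postgres (pg) client; the notification_jobs table and its columns are assumptions, not an existing schema.

```ts
import { Client } from "pg";

// Claim and process one pre-calculated notification job from a hypothetical
// notification_jobs table, using SKIP LOCKED so concurrent workers don't collide.
export async function workOneJob(databaseUrl: string): Promise<void> {
  const client = new Client({ connectionString: databaseUrl });
  await client.connect();
  try {
    await client.query("BEGIN");
    const { rows } = await client.query(
      `SELECT id, payload
         FROM notification_jobs
        WHERE status = 'pending'
        ORDER BY run_at
        FOR UPDATE SKIP LOCKED
        LIMIT 1`
    );
    if (rows.length > 0) {
      // Placeholder for actually sending the email/tweet/slack message.
      console.log("sending notification", rows[0].payload);
      await client.query(
        "UPDATE notification_jobs SET status = 'sent' WHERE id = $1",
        [rows[0].id]
      );
    }
    await client.query("COMMIT");
  } catch (err) {
    await client.query("ROLLBACK");
    throw err;
  } finally {
    await client.end();
  }
}
```
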
Thoughts

We should have a very clear privacy policy

Landing on the page should engage the user with data driven by a short form. This can lead toward creating an account to provide more specific information on a schedule.

Dogfooding. Plugging in "Software Engineer" + "Remote" + "No standing desk" = "make sure to walk around every hour".

Living in downtown Manhattan on a Sunday morning: "be careful slicing the bagel this morning. X% of ER visits today are from bagel slicing."

Week before daylight saving time: "start shifting your schedule by 15 minutes earlier/later. Heart attacks and car crashes increase after daylight saving time due to sleep deprivation."
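
The dogfooding, bagel, and daylight saving examples above all boil down to matching profile attributes and current conditions against a rule table. A hedged TypeScript sketch of that mapping follows, with invented placeholder rules and copy rather than real statistics:

```ts
// Invented placeholder rules illustrating the attribute -> message mapping.
interface Rule {
  requires: string[]; // profile/condition tags that must all be present
  message: string;
}

const rules: Rule[] = [
  {
    requires: ["software engineer", "remote", "no standing desk"],
    message: "Make sure to walk around every hour.",
  },
  {
    requires: ["week before daylight saving time"],
    message: "Start shifting your schedule by 15 minutes earlier each day.",
  },
];

export function messagesForProfile(tags: string[]): string[] {
  const have = new Set(tags.map((t) => t.toLowerCase()));
  return rules
    .filter((rule) => rule.requires.every((req) => have.has(req)))
    .map((rule) => rule.message);
}

// Dogfooding example from the notes:
console.log(messagesForProfile(["Software Engineer", "Remote", "No standing desk"]));
// -> ["Make sure to walk around every hour."]
```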

Needs to be friendly and not anxiety-inducing. Carson mentioned the Citizen app as the antithesis.

Wireframe

Landing

Investigation

Legacy

Helios

Helios is our weather/welcome display for the Providence and Boulder front doors. It serves as an opportunity to welcome guests and show off some of our technical expertise.

StandupHub

StandupHub is a web service to easily track your tasks and their statuses for standup.

Stack

Startup

```sh
cd standuphub-api-ex
# start a postgresql instance on localhost:5432 (you can use an OS service instead of docker)
docker-compose up db
# start the phoenix server on localhost:4000, connected to localhost for the database
mix phx.server

cd ../standuphub-web
# start the webpack server on localhost:3000 pointed at the local phoenix API server
REACT_APP_API_SERVER=http://localhost:4000 yarn start
```