CST 311: Week 2

During week 2 of our computer networking course, we focused on the application layer (the top layer of the five-layer protocol stack that makes up the Internet architecture).

In the opening section of our reading, we were given an overview of common application-layer protocols and the underlying transport-layer protocol (either TCP or UDP) associated with each. Most common application-layer protocols seem to be built on top of TCP (Transmission Control Protocol) rather than UDP (User Datagram Protocol).

What transport layer protocol should my app be built over?

TCP is a popular choice because it provides a reliable data transfer service: it acknowledges and retransmits lost packets so that every piece of a message arrives and is reassembled in the correct order. That reliability comes at a price, though. Connection setup, congestion control, and retransmission all add latency. In some applications (e.g., telephony or media streaming), some data loss is tolerable when it means less latency, so those applications may be built over UDP rather than TCP. Conversely, an application that cannot tolerate data loss (e.g., text messaging or file transfer) but can tolerate some latency is typically built over TCP.

Pushy protocols give and pully ones receive

We also learned that application protocols can be classified as pull or push. For instance, HTTP is a pull protocol because a client uses HTTP to retrieve data from a web server. SMTP, on the other hand, is a push protocol because email clients push data out to a receiving email server. This gives context to why we use these verbs in git.

Sorry, Mario. Your IP is in another DNS Castle

We also covered the Domain Name System (DNS). I previously understood the function of DNS, that there are many DNS servers, and that no single server holds an exhaustive mapping of every domain name to an IP address. However, I was unaware of the extent to which DNS servers are organized hierarchically (root, top-level domain, and authoritative servers) and that queries can be resolved either iteratively or recursively.

From Kurose & Ross (2016)

P2P Architecture

One of my favorite parts of this week’s reading was about Peer-to-Peer (P2P) architecture because it made me nostalgic for when the architecture was first popularized, when I was in high school circa 1999. During this time, many P2P media-sharing applications were born: Napster, Morpheus, Kazaa, SoulSeek. This method of file sharing was revolutionary because anyone could be both a seed (server) and a leech (client) at the same time, and you did not have to have the entire file to serve chunks of data to others. P2P architecture scales much better than client-server architecture as the number of hosts increases, because every host becomes a server as soon as it successfully fetches some part of the target file.
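Kurose & Ross make this scaling argument precise with lower bounds on distribution time for a file of F bits to N peers: client-server needs at least max(NF/u_s, F/d_min), while P2P needs at least max(F/u_s, F/d_min, NF/(u_s + Σu_i)), where u_s is the server's upload rate, d_min the slowest peer's download rate, and u_i each peer's upload rate. A quick sketch of how the two diverge (all the rates below are numbers I picked for illustration):

```javascript
// Lower bounds on distribution time (seconds), per Kurose & Ross ch. 2.
const F = 1e9;    // 1 Gbit file
const us = 30e6;  // server upload: 30 Mbps
const dmin = 2e6; // slowest peer download: 2 Mbps
const u = 1e6;    // each peer uploads at 1 Mbps

function clientServerTime(N) {
  return Math.max((N * F) / us, F / dmin);
}

function p2pTime(N) {
  // Peer upload capacity grows with N, so the bound flattens out.
  return Math.max(F / us, F / dmin, (N * F) / (us + N * u));
}

for (const N of [10, 100, 1000]) {
  console.log(N, clientServerTime(N).toFixed(0), p2pTime(N).toFixed(0));
}
```

At N = 1000 the client-server bound is over thirty times the P2P bound with these numbers, because in P2P every new peer also brings new upload capacity.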

CST 311: Week 1

After a short summer break, we have begun a new class in Computer Networking. I think this course will really demystify some of the concepts we glossed over in Internet Programming, particularly when it comes to making design decisions in creating a web API.

This week, we read the first chapter of Computer Networking: A Top-Down Approach (Kurose and Ross, 2016), which included a broad overview of the five-layer Internet architecture:

Credit: microchipdeveloper.com

Week 8 CST 336

For the final for this class, our team had to develop a shopping website. The requirements were:

Minimum requirements:

If any of the following elements is missing, 20 points will be deducted.

  • Documentation must include Title, Description, Mockup, Database Schema, and Screenshots (-20 pts if any are missing)
  • Project must use at least four database tables
  • The combined database tables must have at least ten fields
  • One of the database tables must have at least 20 records

Feature requirements:

  • There is a “user” section in which users can search and filter data using at least three fields (10 pts)
  • Users can add items to a shopping cart (10 pts)
  • Users can see all items in their cart, with the total cost displayed (10 pts)
  • Administrators can log in and log out of the system (10 pts)
  • Administrators can update the content of at least one table, using pre-populated data in the form (10 pts)
  • Administrators can insert new records into at least one table (10 pts)
  • Administrators can delete records (10 pts)
  • Administrators can generate at least three reports that use aggregate functions (e.g., the average price of all products in the table) (10 pts)
  • Project uses at least two AJAX calls with their corresponding Web APIs; the submission must explain where the AJAX calls are (10 pts)
  • Project has a nice and consistent design (preferably using Bootstrap) (10 pts)

We met all the requirements with our project, Pontificating Monty’s Firefly Bazaar.

Week 7 CST 336

This week we learned about authentication and sessions.

One method of authentication is HTTP Basic Auth, which simply sends username:password (Base64-encoded) with each request. However, passwords should not be stored or transmitted as plain text, because plain text is easily intercepted and read. There are many password-hashing schemes, but we focused on bcrypt, an adaptive hashing function whose configurable work factor can be raised as hardware gets faster, keeping it resistant to brute-force cracking.

We did not implement bcrypt ourselves; instead, we used the bcrypt package on npm to hash our passwords and implement authentication in our web apps. We also tracked users across requests with the express-session package.

CST 336 Week 6

This week we connected our knowledge of Node.js with the skills we built in the Intro to Databases class, using the node-mysql package. Of course, preventing SQL injection was emphasized. If user input is not properly escaped, anyone could inject their own SQL queries and gain access to the entire database; they could even drop an entire schema's worth of tables.

There are two methods for escaping user input. One is to call the escape method from the mysql npm package on the user input directly. The other is to use a question-mark placeholder for each user-supplied value; the values are then passed as an array of arguments to the query.

Week 5 CST 336

This week we learned about Node.js, which is a JavaScript runtime environment. We also delved into the world of npm or the Node Package Manager, which hosts the largest public registry of packages in the world. Initializing a Node project and installing a Node package from the command line is super simple:

npm init
npm i <package-name> --save

The first line initiates an interactive Node package set-up process. This creates a file called package.json, which houses information about your newly created Node package, including:

  • name: Name of package
  • version: Version of package
  • description: Describes your package
  • homepage: URL to project homepage
  • bugs: URL to project’s issue tracker (If using GitHub, this will be your /issues path)
  • license: ISC by default
  • author: Name of developer
  • main: Primary entry point to the program
  • directories: A way of specifying the structure of the package
  • repository: URL for the package’s repo (detected automatically if it is on GitHub)
  • scripts: Dictionary containing script commands run at various times
  • dependencies: Names and versions of packages that your package depends upon

The second command installs one or more packages into the node_modules directory and records them in the dependencies field of package.json. (Since npm 5, --save is the default behavior, so the flag is optional.)
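For reference, a minimal package.json in the shape npm init produces might look like this (every value below is illustrative, not from an actual project):

```json
{
  "name": "my-first-app",
  "version": "1.0.0",
  "description": "A sample Node package",
  "main": "index.js",
  "scripts": {
    "start": "node index.js"
  },
  "author": "Jane Developer",
  "license": "ISC",
  "dependencies": {
    "express": "^4.16.0"
  }
}
```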

One of the packages we worked with was Express.js. Express is a minimal web framework for Node.js that provides routing and a middleware pipeline; among its middleware functions are ones that parse incoming HTTP request bodies containing JSON payloads.

To create an Express application, you first have to import the Express module and then create an instance of an Express application:

var express = require('express'); // import Express module
var app = express(); // Initialize Express app

Week 4: CST 336

This week’s assignment was interesting because we essentially had to refactor our basic HTML/CSS assignment from week 1 with more advanced tools: JavaScript on the Node.js runtime, Express, and the faker package.

It became clear how much easier it would be to maintain a larger-scale web project using template partials: instead of editing duplicated markup across multiple source files, all you have to do is edit the associated partial!

While my submission for this week’s website assignment might not look much different from week 1’s, it is implemented with more sophisticated technology that makes it far easier to scale and maintain.

Week 3 CST 336

This week we learned how to display data from an existing Web API using AJAX, based on a user's input.

I created a website that used NASA’s Astronomy Picture of the Day (APOD) API based on a valid date provided by the user. The date is validated against the earliest date in APOD and today’s date. If the user supplies an invalid date, a dynamic warning alert comes up with the valid date range. If a valid date is entered, a success alert pops up confirming the date being fetched.
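The validation itself is plain JavaScript. A sketch of the check (APOD's earliest entry is June 16, 1995, per NASA's API documentation; the function name is mine):

```javascript
// Returns true when a date string falls in APOD's valid range:
// from the first picture (1995-06-16) through today.
function isValidApodDate(dateString) {
  const requested = new Date(dateString);
  if (Number.isNaN(requested.getTime())) return false; // not a parsable date
  const earliest = new Date('1995-06-16');
  const today = new Date();
  return requested >= earliest && requested <= today;
}

console.log(isValidApodDate('2000-01-01')); // true
console.log(isValidApodDate('1990-01-01')); // false: before APOD existed
```

A valid date can then be dropped into the api.nasa.gov request's date parameter; an invalid one triggers the warning alert instead.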

CST 336 – Week 2

Week 2 added jQuery and JavaScript. We also imported CSS rules and JavaScript functions from Bootstrap 3. I was able to make a simple game using these skills in a fairly short amount of time.

You can view it on Heroku.

I really want to spend some more time with the Bootstrap documentation and take a peek at the newer Bootstrap 4.

I actually wanted to take advantage of the 4-day weekend to either get ahead in this class or to really brush up on machine learning for a job interview I have coming up in a week, but I wound up getting sick. I took that as a sign to slow down and take it easy. Luckily enough, before I got too sick, I was able to enjoy the San Diego Zoo and Balboa Park.

CST 336 – Week 1

We have begun our course in Internet Programming. This week was a nice refresher in basic HTML and CSS. However, I did delve into some of the newer tags in HTML5 and CSS3 rules/properties that I had not dealt with before.

I am enjoying the format of this class. It seems pretty consistent–readings/videos, followed by a step-by-step lab with an extension activity, and then a final homework assignment in which we apply everything we’ve learned to something individually unique.

This week’s assignment had a stipulation–it had to be an informative website about something within the realm of computer science. As someone who has studied computational linguistics, I naturally went for Natural Language Processing.

You can view my website on Heroku!

In other news, I spent time over the weekend at Qualcomm’s Thinkabit Lab–part of Qualcomm’s philanthropy sector that brings STEM education to schools. I got to pilot some of their new curriculum that they will be bringing to high schoolers later this year.

Having fun with the Thinkabit Lab people!