Assignment 1: Linear classifiers

Due date: Thursday, February 15, 11:59:59 PM

 

In this assignment you will implement simple linear classifiers and run them on two different datasets:

1. Rice dataset: a simple categorical binary classification dataset. Please note that the labels in the dataset are 0/1, as opposed to -1/1 as in the lectures, so you may have to change either the labels or the derivations of the parameter update rules accordingly (see the sketch after this list).

2. Fashion-MNIST: a multi-class image classification dataset
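For example, if you prefer to keep the -1/+1 derivations from the lectures for the Rice dataset, a one-line label conversion is enough. The snippet below is a minimal sketch and assumes the labels have been loaded into a NumPy array:

import numpy as np

y = np.array([0, 1, 1, 0])   # hypothetical 0/1 labels as loaded from the Rice dataset
y_signed = 2 * y - 1         # maps 0 -> -1 and 1 -> +1, matching the lecture convention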

The goal of this assignment is to help you understand the fundamentals of a few classic methods and become familiar with scientific computing tools in Python. You will also get experience in hyperparameter tuning and using proper train/validation/test data splits.

Download the starting code here.

You will implement the following classifiers (in their respective files):

1. Logistic regression (logistic.py)

2. Perceptron (perceptron.py)

3. SVM (svm.py)

4. Softmax (softmax.py)
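As a reference point for the update rules derived in lecture, the classic perceptron update with -1/+1 labels looks like the minimal sketch below. The function name and signature here are illustrative only, not the interface that perceptron.py expects:

import numpy as np

def perceptron_train(X, y, lr=1.0, n_epochs=10):
    # X: (n_samples, n_features) array; y: labels in {-1, +1}
    w = np.zeros(X.shape[1])
    b = 0.0
    for _ in range(n_epochs):
        for xi, yi in zip(X, y):
            # Update the weights only when the current example is misclassified.
            if yi * (xi @ w + b) <= 0:
                w += lr * yi * xi
                b += lr * yi
    return w, b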

For the logistic regression classifier, multi-class prediction is difficult, as it requires training a separate one-vs-one or one-vs-rest classifier for every class. Therefore, you only need to use logistic regression on the Rice dataset.
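For reference, with 0/1 labels the gradient of the mean binary cross-entropy loss reduces to X^T (sigmoid(Xw) - y) / n. The snippet below sketches one gradient-descent step under that convention; the names are illustrative and do not reflect the required structure of logistic.py:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def logistic_step(w, X, y, lr=0.1):
    # y is assumed to hold 0/1 labels, matching the Rice dataset.
    p = sigmoid(X @ w)               # predicted probability of class 1
    grad = X.T @ (p - y) / len(y)    # gradient of the mean cross-entropy loss
    return w - lr * grad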

The top-level notebook (CS 444 Assignment-1.ipynb) will guide you through all of the steps.

Setup instructions are below. The format of this assignment is inspired by the Stanford CS231n assignments, and we have borrowed some of their data-loading code and instructions in our assignment IPython notebook.

No part of this assignment requires a machine with a GPU. You may complete the assignment on your local machine or using Google Colaboratory.

Environment Setup (Local)

If you will be completing the assignment on a local machine, you will need a Python environment set up with the appropriate packages.

We suggest that you use Anaconda to manage Python package dependencies (https://www.anaconda.com/download). This guide provides useful information on how to use Conda: https://conda.io/docs/user-guide/getting-started.html.

Data Setup (Local)

Once you have downloaded and opened the zip file, navigate to the fashion-mnist directory in assignment1 and execute the get_data.sh script provided:

$ cd assignment1/fashion-mnist/

$ sh get_data.sh    (or: $ bash get_data.sh)

The Rice dataset is small enough that we've included it in the zip file.

Data Setup (For Colaboratory)

If you are using Google Colaboratory for this assignment, all of the Python packages you need will already be installed. The only thing you need to do is download the datasets and make them available to your account.

Download the assignment zip file and follow the steps above to download Fashion-MNIST to your local machine. Next, make a folder in your Google Drive to hold all of your assignment files and upload the entire assignment folder (including the datasets you downloaded) into this Google Drive folder.

You will now need to open the assignment 1 IPython notebook file from your Google Drive folder in Colaboratory and run a few setup commands. You can find a detailed tutorial on these steps here (no need to worry about setting up a GPU for now). However, we have condensed all the important commands you need to run into an IPython notebook.
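For orientation, mounting your Drive from a Colab cell typically looks like the snippet below; the notebook already contains the exact commands you need, so treat this only as a reference:

from google.colab import drive

drive.mount('/content/drive')
# After mounting, the folder you uploaded appears under /content/drive/MyDrive/.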

IPython

The assignment is given to you in the CS 444 Assignment-1.ipynb file. As mentioned, if you are using Colaboratory, you can open the IPython notebook directly in Colaboratory. If you are using a local machine, ensure that IPython is installed (https://ipython.org/install.html). You may then navigate to the assignment directory in the terminal and start a local IPython server using the jupyter notebook command.
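For example, from the unzipped assignment folder:

$ cd assignment1/
$ jupyter notebook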

Submission Instructions

Submission of this assignment will involve three steps:

1. If you are working in a pair, only one designated student should make the submission to Canvas and Kaggle. You should indicate your team name on the Kaggle leaderboard and list the team members in the report.

2. You must submit your output Kaggle CSV files from each model on the Fashion-MNIST dataset to their corresponding Kaggle competition webpages (see the CSV sketch after these submission steps):

  Perceptron

  SVM

  Softmax

The baseline accuracies you should approximately reach are listed as benchmarks on each respective Kaggle leaderboard.

3. You must upload three files on Canvas:

1. All of your code (Python files and ipynb file) in a single ZIP file. The filename should be netid_mp1_code.zip. Do NOT include datasets in your zip file.

2. Your IPython notebook with output cells converted to PDF format. The filename should be netid_mp1_output.pdf.

3. A brief report in PDF format using this template. The filename should be netid_mp1_report.pdf.
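If you need to produce the Kaggle CSV files yourself, a sketch like the one below works. The column names, id convention, and filename here are assumptions for illustration; follow whatever format the competition pages and starter notebook specify:

import numpy as np
import pandas as pd

# y_test_pred: 1-D array of predicted class indices for the Fashion-MNIST test set.
y_test_pred = np.zeros(10000, dtype=int)     # placeholder predictions
submission = pd.DataFrame({
    "Id": np.arange(len(y_test_pred)),       # assumed id column name
    "Category": y_test_pred,                 # assumed label column name
})
submission.to_csv("netid_perceptron.csv", index=False)   # hypothetical filename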

Don't forget to hit "Submit" after uploading your files, otherwise we will not receive your submission!

Please refer to course policies on academic honesty, collaboration, late submission, etc.