DDA3020 Homework 1
Due date: Oct 14, 2024
Instructions
• The deadline is 23:59, Oct 14, 2024.
• The weight of this assignment in the final grade is 20%.
• Electronic submission: Turn in solutions electronically via Blackboard. Be sure to submit
your homework as one pdf file plus two python scripts. Please name your solution files as
"DDA3020HW1 studentID name.pdf", "HW1 yourID Q1.ipynb" and "HW1 yourID Q2.ipynb"
(.py files also acceptable).
• Note that late submissions will result in discounted scores: 0-24 hours → 80%, 24-120 hours
→ 50%, 120 or more hours → 0%.
• Answer the questions in English. Otherwise, you’ll lose half of the points.
• Collaboration policy: You need to solve all questions independently and collaboration between
students is NOT allowed.
1 Written Problems (50 points)
1.1. (Learning of Linear Regression, 25 points) Suppose we have training data
$\{(x_1, y_1), (x_2, y_2), \ldots, (x_N, y_N)\}$,
where $x_i \in \mathbb{R}^d$ and $y_i \in \mathbb{R}^k$, $i = 1, 2, \ldots, N$.
i) (9 pts) Find the closed-form solution of the following problem:
$$\min_{W, b} \sum_{i=1}^{N} \left\| y_i - W x_i - b \right\|_2^2.$$
ii) (8 pts) Show how to use gradient descent to solve the problem. (Please state at least one
possible Stopping Criterion)
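(For intuition only, not part of the required written derivation: below is a minimal NumPy sketch of gradient descent on this objective with a loss-change stopping criterion; the shapes, learning rate, and tolerance are arbitrary illustrative choices.)

```python
# Illustrative gradient-descent sketch for sum_i ||y_i - W x_i - b||_2^2.
import numpy as np

rng = np.random.default_rng(0)
N, d, k = 100, 5, 3
X = rng.normal(size=(N, d))          # rows are x_i^T
Y = rng.normal(size=(N, k))          # rows are y_i^T

W = np.zeros((k, d))
b = np.zeros(k)
lr, tol = 1e-3, 1e-8

prev_loss = np.inf
for step in range(10000):
    R = Y - X @ W.T - b              # residuals r_i = y_i - W x_i - b
    loss = np.sum(R ** 2)
    if abs(prev_loss - loss) < tol:  # stopping criterion: loss change below tol
        break
    prev_loss = loss
    W += lr * 2 * R.T @ X            # gradient w.r.t. W is -2 * R^T X
    b += lr * 2 * R.sum(axis=0)      # gradient w.r.t. b is -2 * sum_i r_i
```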
iii) (8 pts) We further suppose that $x_1, x_2, \ldots, x_N$ are drawn from $\mathcal{N}(\mu, \sigma^2)$. Show that the
maximum likelihood estimation (MLE) of $\sigma^2$ is
$$\hat{\sigma}^2_{\mathrm{MLE}} = \frac{1}{N} \sum_{n=1}^{N} (x_n - \mu_{\mathrm{MLE}})^2.$$
1.2. (Support Vector Machine, 25 points) Given two positive samples $x_1 = (3, 3)^T$, $x_2 = (4, 3)^T$, and one negative sample $x_3 = (1, 1)^T$, find the maximum-margin separating hyperplane and
support vectors.
Solution steps:
i) Formulating the Optimization Problem (5 pts)
ii) Constructing the Lagrangian (5 pts)
iii) Using KKT Conditions (5 pts)
iv) Solving the Equations (5 pts)
v) Determining the Hyperplane Equation and Support Vectors (5 pts)
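(Optional: after deriving the answer by hand, you can sanity-check it numerically. The snippet below is one such check, simulating the hard margin with a large C as in the programming part; it is not part of the required written solution.)

```python
# Optional numeric sanity check of the written SVM problem (Q1.2).
# A very large C approximates the hard-margin SVM.
import numpy as np
from sklearn.svm import SVC

X = np.array([[3, 3], [4, 3], [1, 1]])
y = np.array([1, 1, -1])

clf = SVC(kernel="linear", C=1e5).fit(X, y)
print("w =", clf.coef_[0])                      # compare with your hand-derived w
print("b =", clf.intercept_[0])                 # compare with your hand-derived b
print("support vector indices:", clf.support_)  # compare with your support vectors
```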
2 Programming (50 points)
2.1. (Linear regression, 25 points) We have a labeled dataset $D = \{(x_1, y_1), (x_2, y_2), \ldots, (x_n, y_n)\}$, with $x_i \in \mathbb{R}^d$ being the $d$-dimensional feature vector of the $i$-th sample, and $y_i \in \mathbb{R}$
being the real-valued target (label).
A linear regression model is given by
$$f_{w_0,\ldots,w_d}(x) = w_0 + w_1 x_1 + w_2 x_2 + \cdots + w_d x_d, \qquad (1)$$
where $w_0$ is often called the bias and $w_1, w_2, \ldots, w_d$ are often called the coefficients.
Now, we want to utilize the dataset D to build a linear model based on linear regression.
We provide a training set Dtrain that includes 2024 labeled samples with 11 features (see linear
regression train.txt) to fit the model, and a test set Dtest that includes 10 unlabeled samples with
11 features (see linear regression test.txt) to evaluate the model.
1. Use the LinearRegression class from the sklearn package to get the bias $w_0$ and the coefficients
$w_1, w_2, \ldots, w_{11}$, then compute $\hat{y} = f(x)$ on the test set Dtest with the trained model. (Put
the estimates of $w_0, w_1, \ldots, w_{11}$ and these $\hat{y}$ in your answers.)
2. Implement linear regression by yourself to obtain the bias $w_0$ and the coefficients
$w_1, w_2, \ldots, w_{11}$, then compute $\hat{y} = f(x)$ on the test set Dtest. (Put the estimates of
$w_0, w_1, \ldots, w_{11}$ and these $\hat{y}$ in your answers. It is allowed to compute the inverse of a matrix
using an existing python package.)
(Hint: Note that linear regression train.txt has 2024 rows with 12 columns, where the
first 11 columns are the features x and the last column is the target y, while linear regression test.txt
only contains 10 rows with 11 columns (features). Both tasks require the submission of
code and results. Put all the code in a "HW1 yourID Q1.ipynb" Jupyter notebook file (".py"
file is also acceptable).)
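Both parts can be sketched in a few lines. The sketch below is a minimal illustration, assuming the data files are plain whitespace-separated text named linear_regression_train.txt and linear_regression_test.txt (adjust the names to the files actually provided):

```python
# Minimal sketch for both parts of Q2.1.
import numpy as np
from sklearn.linear_model import LinearRegression

train = np.loadtxt("linear_regression_train.txt")   # assumed filename
test = np.loadtxt("linear_regression_test.txt")     # assumed filename
X_train, y_train = train[:, :11], train[:, 11]

# Part 1: sklearn's LinearRegression.
model = LinearRegression().fit(X_train, y_train)
print("w0 =", model.intercept_, "w1..w11 =", model.coef_)
print("y_hat =", model.predict(test))

# Part 2: closed-form solution via the normal equations, using an augmented
# design matrix with a leading column of ones for the bias.
A = np.hstack([np.ones((X_train.shape[0], 1)), X_train])
w = np.linalg.solve(A.T @ A, A.T @ y_train)         # [w0, w1, ..., w11]
print("w0 =", w[0], "w1..w11 =", w[1:])
print("y_hat =", np.hstack([np.ones((test.shape[0], 1)), test]) @ w)
```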
2.2. (SVM, 25 points)
Task Description You are asked to write a program that constructs support vector machine
models with different kernel functions and slack variables.
Datasets You are provided with the iris dataset. The data set contains 3 classes of 50 instances
each, where each class refers to a type of iris plant. There are four features: 1. sepal length in cm;
2. sepal width in cm; 3. petal length in cm; 4. petal width in cm. You need to use these features
to classify each iris plant as one of the three possible types.
What you should do You should use the SVM functions from the python sklearn package, which
provides various forms of SVM. For multiclass SVM you should use the one-vs-rest
strategy. You are recommended to use the sklearn.svm.SVC class. You can use numpy for vector
manipulation. In your technical report, you should report the results required as mentioned below (e.g.
training error, testing error, and so on).
1. (2 points) Split training set and test set. Split the data into a training set and a test set.
The training set should contain 70% of the samples, while the test set should include 30%.
The number of samples from each category in both the training and test sets should reflect
this 70-30 split; for each category, the first 70% of the samples will form the training set, and
the remaining 30% will form the test set. Ensure that the split maintains the original order
of the data. You should report the instance ids in the split training set and test set. The output
format is as follows:
Q2.2.1 Split training set and test set:
Training set: xx
Test set: xx
You should fill in xx in the template. You should write the ids for each set on the same line,
comma separated, e.g. Training set: [1, 4, 19].
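A minimal sketch of this split, assuming the iris data is loaded through sklearn (whose 150 samples appear class by class, 50 each) and that instance ids are 0-based indices into that ordering:

```python
# Ordered per-class 70/30 split for Q2.2.1.
import numpy as np
from sklearn.datasets import load_iris

X, y = load_iris(return_X_y=True)
train_idx, test_idx = [], []
for c in np.unique(y):
    ids = np.where(y == c)[0]      # original order within the class
    cut = int(len(ids) * 0.7)      # first 70% -> training set
    train_idx.extend(ids[:cut])
    test_idx.extend(ids[cut:])

print("Training set:", train_idx)
print("Test set:", test_idx)
X_train, y_train = X[train_idx], y[train_idx]
X_test, y_test = X[test_idx], y[test_idx]
```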
2. (10 points) Calculation using Standard SVM Model (Linear Kernel). Employ the
standard SVM model with a linear kernel. Train your SVM on the split training dataset and
validate it on the testing dataset. Calculate the classification error for both the training and
testing datasets, and output the weight vector w, the bias b, and the indices of the support vectors
(starting from 0). Note that the scikit-learn package does not offer a function with a hard margin,
so we will simulate this using C = 1e5. You should first print out the total training error
and testing error, where the error is (number of wrong predictions) / (number of data points).
Then, print out the results for each class
separately (note that you should calculate errors for each class separately in this part). You
should also mention in your report which classes are linearly separable with SVM without slack.
The output format is as follows:
Q2.2.2 Calculation using Standard SVM Model:
total training error: xx, total testing error: xx,
class setosa:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
class versicolor:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
class virginica:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
Linear separable classes: xx
If we view the one-vs-rest strategy as combining multiple different SVMs, each one being
a separating hyperplane between one class and the rest of the points, then the w, b and support
vector indices for that class are the corresponding parameters of the SVM separating this class
from the rest of the points. If a variable is of vector form, say $a = (1, 2, 3)^T$, then you should write
each entry on the same line, comma separated, e.g. [1,2,3].
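A minimal sketch of this per-class procedure, assuming the split from Q2.2.1 is already stored in X_train, y_train, X_test, y_test (labels 0, 1, 2 in the usual sklearn iris order):

```python
# One binary linear SVC per class (one-vs-rest), with C=1e5 simulating the
# hard margin as stated above. Prints per-class errors, w, b, and support
# vector indices in roughly the required format.
import numpy as np
from sklearn.svm import SVC

for c, name in enumerate(["setosa", "versicolor", "virginica"]):
    yb_train = np.where(y_train == c, 1, -1)   # this class vs. the rest
    yb_test = np.where(y_test == c, 1, -1)
    clf = SVC(kernel="linear", C=1e5).fit(X_train, yb_train)
    train_err = np.mean(clf.predict(X_train) != yb_train)
    test_err = np.mean(clf.predict(X_test) != yb_test)
    print(f"class {name}:")
    print(f"training error: {train_err}, testing error: {test_err},")
    print(f"w: {list(clf.coef_[0])}, b: {clf.intercept_[0]},")
    print(f"support vector indices: {list(clf.support_)},")
```

The total errors can be computed analogously, either from a multiclass SVC with decision_function_shape="ovr" or by taking the argmax over the three per-class decision functions.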
3. (6 points) Calculation using SVM with Slack Variables (Linear Kernel). For each
C = 0.25 × t, where t = 1, 2, . . . , 4, train your SVM on the training dataset, and subsequently
validate it on the testing dataset. Calculate the classification error for both the training and
testing datasets, the weight vector w, the bias b, the indices of the support vectors, and the
slack variable ζ of the support vectors (you may compute it as max(0, 1 − y · f(x))). The output
format is as follows:
Q2.2.3 Calculation using SVM with Slack Variables (C = 0.25 × t, where t = 1, . . . , 4):
-------------------------------------------
C=0.25,
total training error: xx, total testing error: xx,
class setosa:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
slack variable: xx,
class versicolor:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
slack variable: xx,
class virginica:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
slack variable: xx,
-------------------------------------------
C=0.5,
<... results for (C=0.5) ...>
-------------------------------------------
C=0.75,
<... results for (C=0.75) ...>
-------------------------------------------
C=1,
<... results for (C=1) ...>
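The slack values can be read off the fitted model directly. A minimal sketch, reusing the one-vs-rest setup from Q2.2.2 (yb_train is the ±1 label vector for one class):

```python
# Slack variables for Q2.2.3: zeta_i = max(0, 1 - y_i * f(x_i)), evaluated
# at the support vectors of each fitted model.
import numpy as np
from sklearn.svm import SVC

for t in range(1, 5):
    C = 0.25 * t
    clf = SVC(kernel="linear", C=C).fit(X_train, yb_train)
    sv = clf.support_                           # support vector indices
    f = clf.decision_function(X_train[sv])      # f(x) at the support vectors
    zeta = np.maximum(0, 1 - yb_train[sv] * f)  # slack per support vector
    print(f"C={C}, slack variable: {list(zeta)}")
```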
4. (7 points) Calculation using SVM with Kernel Functions. Conduct experiments with
different kernel functions for SVM without slack variables. Calculate the classification error
for both the training and testing datasets, and the indices of the support vectors, for each kernel
type (see the sketch after the output template below):
(a) 2nd-order Polynomial Kernel
(b) 3rd-order Polynomial Kernel
(c) Radial Basis Function Kernel with σ = 1
(d) Sigmoidal Kernel with σ = 1
The output format is as follows:
Q2.2.4 Calculation using SVM with Kernel Functions:
-------------------------------------------
(a) 2nd-order Polynomial Kernel,
total training error: xx, total testing error: xx,
class setosa:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
class versicolor:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
class virginica:
training error: xx, testing error: xx,
w: xx, b: xx,
support vector indices: xx,
-------------------------------------------
(b) 3rd-order Polynomial Kernel,
<... results for (b) ...>
-------------------------------------------
(c) Radial Basis Function Kernel with σ = 1,
<... results for (c) ...>
-------------------------------------------
(d) Sigmoidal Kernel with σ = 1,
<... results for (d) ...>
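A sketch of the four kernel configurations. Note that sklearn parameterizes the RBF kernel as exp(−gamma‖x − x′‖²), so if the course convention is exp(−‖x − x′‖²/(2σ²)), then σ = 1 corresponds to gamma = 0.5; the same mapping is assumed for the sigmoid kernel below, and the polynomial kernel's gamma/coef0 defaults may also need adjusting to match the course's kernel definition, so treat these settings as assumptions to verify:

```python
# Kernel settings for Q2.2.4; "without slack variables" is again simulated
# with a large C, as in Q2.2.2. Fit each model one-vs-rest per class and
# report errors and clf.support_. Note that clf.coef_ (the weight vector w)
# is only available for linear kernels.
from sklearn.svm import SVC

C_HARD = 1e5
models = {
    "(a) 2nd-order Polynomial Kernel":   SVC(kernel="poly", degree=2, C=C_HARD),
    "(b) 3rd-order Polynomial Kernel":   SVC(kernel="poly", degree=3, C=C_HARD),
    "(c) RBF Kernel, sigma=1":           SVC(kernel="rbf", gamma=0.5, C=C_HARD),
    "(d) Sigmoid Kernel, sigma=1":       SVC(kernel="sigmoid", gamma=0.5, C=C_HARD),
}
```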
Submission Submit your executable code in a "HW1 yourID Q2.ipynb" Jupyter notebook (".py"
file is also acceptable). Indicate the corresponding question number in a comment for each cell,
and ensure that your code can logically produce the required results for each question in the required
format. Please note that you need to write clear comments and use appropriate function/variable
names. Excessively unreadable code may result in point deductions.

掃一掃在手機(jī)打開當(dāng)前頁(yè)
  • 上一篇:代做CS 259、Java/c++設(shè)計(jì)程序代寫
  • 下一篇:代做MSE 280、代寫Matlab程序語(yǔ)言
  • 無相關(guān)信息
    合肥生活資訊

    合肥圖文信息
    2025年10月份更新拼多多改銷助手小象助手多多出評(píng)軟件
    2025年10月份更新拼多多改銷助手小象助手多
    有限元分析 CAE仿真分析服務(wù)-企業(yè)/產(chǎn)品研發(fā)/客戶要求/設(shè)計(jì)優(yōu)化
    有限元分析 CAE仿真分析服務(wù)-企業(yè)/產(chǎn)品研發(fā)
    急尋熱仿真分析?代做熱仿真服務(wù)+熱設(shè)計(jì)優(yōu)化
    急尋熱仿真分析?代做熱仿真服務(wù)+熱設(shè)計(jì)優(yōu)化
    出評(píng) 開團(tuán)工具
    出評(píng) 開團(tuán)工具
    挖掘機(jī)濾芯提升發(fā)動(dòng)機(jī)性能
    挖掘機(jī)濾芯提升發(fā)動(dòng)機(jī)性能
    海信羅馬假日洗衣機(jī)亮相AWE  復(fù)古美學(xué)與現(xiàn)代科技完美結(jié)合
    海信羅馬假日洗衣機(jī)亮相AWE 復(fù)古美學(xué)與現(xiàn)代
    合肥機(jī)場(chǎng)巴士4號(hào)線
    合肥機(jī)場(chǎng)巴士4號(hào)線
    合肥機(jī)場(chǎng)巴士3號(hào)線
    合肥機(jī)場(chǎng)巴士3號(hào)線
  • 短信驗(yàn)證碼 目錄網(wǎng) 排行網(wǎng)

    關(guān)于我們 | 打賞支持 | 廣告服務(wù) | 聯(lián)系我們 | 網(wǎng)站地圖 | 免責(zé)聲明 | 幫助中心 | 友情鏈接 |

    Copyright © 2025 hfw.cc Inc. All Rights Reserved. 合肥網(wǎng) 版權(quán)所有
    ICP備06013414號(hào)-3 公安備 42010502001045

    欧美精品一二| 欧美精品97| 天天射成人网| 日韩欧美黄色| 日韩高清一区在线| av中文字幕在线观看第一页| 久久在线免费| 日韩免费精品| 欧美日本一区二区高清播放视频| 欧美激情欧美| 黑人一区二区| 精品少妇一区| 日本亚洲不卡| 欧美激情视频一区二区三区免费| 一区二区精品伦理...| 亚洲精品97| 精品久久久久久久久久久aⅴ| 一区在线不卡| 九九九精品视频| av女在线播放| 久久成人精品| 狠狠爱www人成狠狠爱综合网| 精品国产一区二区三区不卡蜜臂| 国产一区网站| 91精品国产色综合久久不卡粉嫩| 国产91亚洲精品久久久| 色999日韩| 另类图片国产| 亚洲一区久久| 中文国产一区| 午夜电影亚洲| 波多野结衣的一区二区三区| 欧洲亚洲成人| 精品理论电影| 鲁大师精品99久久久| 国产精品视频3p| 精品成人18| 伊色综合久久之综合久久| 日韩成人一区二区| 日韩成人午夜精品| 精品三级久久久| 亚洲大片精品免费| 国产欧美欧美| 奇米色欧美一区二区三区| 欧美视频精品全部免费观看| 91成人app| 一级欧美视频| 国产欧美一区| 日本中文字幕一区二区视频| 亚洲理论电影| 伊人久久影院| 精品国产一区二区三区不卡蜜臂 | 一区免费视频| 精品91久久久久| 亚洲欧美激情诱惑| 蜜臀久久99精品久久久久久9| 亚洲专区免费| 日韩aaaa| 日韩一区二区三区免费播放| 99只有精品| 日本伊人色综合网| 中文字幕亚洲精品乱码| 亚洲+变态+欧美+另类+精品| 日韩av午夜| 精品国产乱码久久久| 欧美先锋资源| 在线综合视频| 黄色亚洲网站| 国产精品第一国产精品| 国内久久精品| 日韩高清电影免费| 日韩精品一区二区三区免费观影| 在线日韩一区| 免费看亚洲片| 日韩国产欧美三级| 日韩欧美字幕| 美日韩一区二区三区| 亚洲精品欧洲| 日韩av午夜在线观看| 天堂日韩电影| 99热免费精品| 日韩欧美国产精品综合嫩v| 国产日韩欧美三区| 国产伦精品一区二区三区千人斩| 国产一区二区三区不卡av| 91成人观看| 在线一区av| 欧美日本二区| 欧美私人啪啪vps| 伊人精品在线| 精品视频在线一区二区在线| 欧美精品18| 亚洲伊人影院| 欧美伦理影院| 国产免费拔擦拔擦8x在线播放 | 亚洲ww精品| 在线国产日韩| 日韩综合一区二区三区| 婷婷综合五月| av综合电影网站| 粉嫩一区二区三区在线观看| 欧美三级自拍| 成人影视亚洲图片在线| 羞羞视频在线观看欧美| 日韩精品一区二区三区中文字幕| 自拍亚洲一区| 91精品国产66| 精品视频在线观看网站| 在线免费观看日本欧美爱情大片| 亚洲优女在线| 亚洲日本三级| 狠狠爱综合网| 国产日韩高清一区二区三区在线| 在线播放一区二区精品视频| 国产婷婷精品| 乱一区二区av| 风间由美中文字幕在线看视频国产欧美| 日韩亚洲精品在线| 久久精品国产99国产精品| 一区二区三区免费在线看| 久久午夜影视| 亚洲精品系列| 一区二区三区视频免费观看| 日韩av首页| av日韩在线播放| 蜜臀久久久99精品久久久久久| 亚洲色图国产| 在线成人直播| 久久久久97| 午夜国产一区二区| 国产精品久久久久久久久久妞妞| 国产精品久av福利在线观看| 91综合在线| 日韩av字幕| 水蜜桃久久夜色精品一区| 日韩欧美ww| 蜜臀av一级做a爰片久久| 亚洲日本免费电影| av不卡免费看| 国产精品亚洲欧美日韩一区在线 | 国产偷自视频区视频一区二区| 日本午夜精品一区二区三区电影| 亚洲午夜激情在线| 日本vs亚洲vs韩国一区三区二区| 国内成人在线| 日本女优在线视频一区二区| 亚洲天堂成人| 国产精品va| 国产一区导航| 国产欧美一区二区三区精品观看 | 亚洲国产一区二区在线观看| 老司机免费视频一区二区三区| 国产一区二区三区四区老人| 麻豆精品在线播放| 欧美特黄一级| 国产精品探花在线观看| 日本一区二区在线看| 激情五月综合婷婷| 欧美天堂一区| 免费毛片在线不卡| 欧美黄色大片网站| 成人影视亚洲图片在线| 超碰97久久| 欧美国产视频| 日韩一级免费| 日韩精品视频在线看| 四虎地址8848精品| 性xxxx欧美老肥妇牲乱| 国产一区二区三区91| 天堂а√在线最新版中文在线| 国产精品成人自拍| 久久男人av| 高清毛片在线观看| 亚洲福利一区| av在线国产精品| 欧美日韩视频网站| 免费欧美一区| 精品视频一区二区三区在线观看| 欧美性www| 尤物网精品视频| 天堂av一区| 亚洲久久在线| 日韩欧美字幕| 夜夜爽av福利精品导航| 激情视频亚洲| 国产精品观看| 日韩国产激情| 亚洲免费高清| 97视频一区| 亚洲国产精品第一区二区| 国产激情在线播放| 激情久久五月| 日本久久伊人| 欧美激情一区| 成人自拍av| 久久国产成人| 亚洲天堂成人| 成人台湾亚洲精品一区二区| 91精品在线免费视频| 国产精品第一|