Step 3 - Scaling the array. We use MinMaxScaler to scale the data in the array to the range 0 to 1, which we pass in the feature_range parameter. We then call fit_transform to fit the scaler to the array and transform it in one step:

minmax_scale = preprocessing.MinMaxScaler(feature_range=(0, 1))
x_scale = minmax_scale.fit_transform(x)

Description. Rescale a variable either to z-scores with a mean of 0 and a standard deviation of 1, to a normalized variable with a minimum of 0 and a maximum of 1, or to a variable computed like a z-score except using the median in place of the mean and the IQR in place of the standard deviation.

Importance of Feature Scaling. Feature scaling through standardization, also called Z-score normalization, is an important preprocessing step for many machine learning algorithms. It involves rescaling each feature so that it has a standard deviation of 1 and a mean of 0. Even though tree-based models are (almost) not affected by feature scaling, many other algorithms are sensitive to the scale of their inputs.

(Figure: feature scaling in Python; image source: Jatin Sharma)

Examples of Algorithms where Feature Scaling matters.
1. K-Means uses the Euclidean distance measure, so feature scaling matters.
2. K-Nearest-Neighbors also requires feature scaling.
3. Principal Component Analysis (PCA) tries to get the features with maximum variance, so here too feature scaling matters.

The scale() function in R can be used to scale the values in a vector, matrix, or data frame. This function uses the following basic syntax:

scale(x, center = TRUE, scale = TRUE)

where:
x: Name of the object to scale.
center: Whether to subtract the mean when scaling. Default is TRUE.
scale: Whether to divide by the standard deviation when scaling. Default is TRUE.

Feature scaling is relatively easy with Python. Note that it is recommended to split the data into training and test sets BEFORE scaling. If scaling is done before partitioning the data, the data may be scaled around the mean of the entire sample, which may differ from the mean of the training data and the mean of the test data.

Standardization: Back to your original question, I should mention that torchvision.transforms.Normalize(mean=0.5, std=0.5) does not transform your data so that it has mean=0.5 and std=0.5, nor will it standardize it to mean=0, std=1. You have to measure the mean and std from your dataset. torchvision.transforms.Normalize simply performs a shift-scale operation, output = (input - mean) / std.

You can use the R function prcomp for PCA. Note that to scale the data first, you can include scale. = TRUE so the scaling is done as part of the PCA function. Don't forget that, to make a prediction for the new city, you'll need to unscale the coefficients (i.e., do the scaling calculation in reverse).

Answer: It shows that our example data consists of two numeric columns, x1 and x2.

Example 1: Scaling Data Frame Using scale() Function. The following R syntax shows how to standardize our example data using the scale function in R. As you can see in the R code, we simply have to insert the name of our data frame (i.e. data) into the scale() function: scale(data).

The short Python sketches below illustrate several of these points.
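For the min-max scaling step above, a minimal self-contained sketch is shown below; the array x is an assumed toy input, since the original snippet does not show the data being scaled.

import numpy as np
from sklearn import preprocessing

x = np.array([[1.0, 10.0],
              [2.0, 20.0],
              [3.0, 30.0]])              # assumed toy array

minmax_scale = preprocessing.MinMaxScaler(feature_range=(0, 1))
x_scale = minmax_scale.fit_transform(x)  # each column is rescaled to [0, 1]
print(x_scale)

fit_transform learns the per-column minimum and maximum and applies the rescaling in one call; on new data you would call transform only, so the range learned from the original array is reused.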
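The three rescaling options in the Description (z-scores, 0-to-1 normalization, and the median/IQR variant) correspond to scikit-learn's StandardScaler, MinMaxScaler, and RobustScaler. A minimal sketch, using an assumed toy column that contains one outlier:

import numpy as np
from sklearn.preprocessing import StandardScaler, MinMaxScaler, RobustScaler

x = np.array([[1.0], [2.0], [3.0], [100.0]])   # assumed data with an outlier

z_scores = StandardScaler().fit_transform(x)   # mean 0, standard deviation 1
unit_range = MinMaxScaler().fit_transform(x)   # minimum 0, maximum 1
robust = RobustScaler().fit_transform(x)       # centered on the median, scaled by the IQR

The robust variant is least affected by the outlier, because the median and IQR ignore extreme values that would distort the mean and standard deviation.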
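To illustrate why scaling matters for the distance-based algorithms listed above (K-Means, K-Nearest-Neighbors, PCA), the sketch below standardizes features of very different magnitudes before clustering; the synthetic data and the choice of two clusters are assumptions made only for illustration.

import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

rng = np.random.default_rng(0)
X = np.column_stack([rng.normal(0, 1, 100),       # feature on a small scale
                     rng.normal(0, 1000, 100)])   # feature on a much larger scale

# Without scaling, the second feature dominates the Euclidean distances;
# standardizing first lets both features contribute to the clustering.
model = make_pipeline(StandardScaler(), KMeans(n_clusters=2, n_init=10))
labels = model.fit_predict(X)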
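For readers following the R scale() discussion from Python, sklearn.preprocessing.scale is a rough analogue of scale(x, center = TRUE, scale = TRUE); the mapping is an approximation rather than an exact equivalence, since R's scale() divides by the sample standard deviation (n - 1 denominator) while scikit-learn divides by the population standard deviation (n denominator).

import numpy as np
from sklearn.preprocessing import scale

x = np.array([[1.0, 4.0],
              [2.0, 5.0],
              [3.0, 6.0]])                                     # assumed toy matrix

centered_and_scaled = scale(x, with_mean=True, with_std=True)  # like center = TRUE, scale = TRUE
centered_only = scale(x, with_mean=True, with_std=False)       # like scale = FALSE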
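The recommendation above to split before scaling can be sketched as follows; the array names X and y and the 70/30 split are assumptions.

import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.preprocessing import StandardScaler

X = np.arange(20, dtype=float).reshape(10, 2)   # assumed feature matrix
y = np.arange(10)                               # assumed labels

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

scaler = StandardScaler().fit(X_train)    # statistics come from the training partition only
X_train_scaled = scaler.transform(X_train)
X_test_scaled = scaler.transform(X_test)  # the test partition reuses the training mean and std

Fitting the scaler on the full sample instead would leak information about the test distribution into the training step, which is exactly the problem the text warns about.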
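The torchvision point above can be made concrete with the sketch below: Normalize only applies output = (input - mean) / std, so passing mean=0.5, std=0.5 merely shifts data from [0, 1] to roughly [-1, 1], while true standardization needs statistics measured from the dataset. The random tensor is an assumed stand-in for a real image dataset.

import torch
from torchvision import transforms

images = torch.rand(100, 1, 28, 28)          # assumed stand-in for a 1-channel image dataset

# Statistics measured from the dataset itself
mean = images.mean().item()
std = images.std().item()

standardize = transforms.Normalize(mean=[mean], std=[std])
standardized = standardize(images[0])        # roughly mean 0, std 1

shift_scale = transforms.Normalize(mean=[0.5], std=[0.5])
shifted = shift_scale(images[0])             # maps [0, 1] to [-1, 1]; not standardization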
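As a rough Python counterpart to the prcomp(scale. = TRUE) advice, the sketch below standardizes the data as part of a PCA workflow and uses the scaler's inverse_transform for the reverse (unscaling) calculation; the dataset and the number of components are assumptions, and this does not reproduce the original regression example about the new city.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.preprocessing import StandardScaler

X = np.random.default_rng(1).normal(size=(50, 4))   # assumed dataset

scaler = StandardScaler().fit(X)
X_scaled = scaler.transform(X)
components = PCA(n_components=2).fit_transform(X_scaled)

# Going back from scaled units to the original units (the scaling calculation in reverse)
X_back = scaler.inverse_transform(X_scaled)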