统计学习基础 (The Elements of Statistical Learning: Data Mining, Inference, and Prediction)

Category: Books, Social Sciences, Statistics, Statistical Methods
Brand: Hastie (哈斯蒂)

Basic Information

· Publisher: 世界图书出版公司 (World Publishing Corporation)

· Pages: 533

· Publication date: 2009

· ISBN: 7506292319 / 9787506292313

· Barcode: 9787506292313

· Edition: 1st

· Binding: Paperback

· Format: 16开 (16mo)

· Language: English

· Original title: The Elements of Statistical Learning: Data Mining, Inference, and Prediction


Synopsis

The learning problems that we consider can be roughly categorized as either supervised or unsupervised. In supervised learning, the goal is to predict the value of an outcome measure based on a number of input measures; in unsupervised learning, there is no outcome measure, and the goal is to describe the associations and patterns among a set of input measures.
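
To make the distinction concrete, here is a minimal illustrative sketch in Python/NumPy (not code from the book): a supervised least-squares fit that predicts an outcome y from inputs X, in the spirit of Section 2.3.1 below, followed by an unsupervised k-means pass (Lloyd's algorithm) that describes cluster structure in the inputs alone, with no outcome measure. All data here is synthetic and made up purely for illustration.

    import numpy as np

    rng = np.random.default_rng(0)

    # Supervised: predict an outcome measure y from input measures X.
    X = rng.normal(size=(100, 2))                      # input measures
    y = X @ np.array([2.0, -1.0]) + 0.1 * rng.normal(size=100)  # outcome
    Xb = np.column_stack([np.ones(100), X])            # add intercept column
    beta_hat, *_ = np.linalg.lstsq(Xb, y, rcond=None)
    print("least-squares coefficients:", beta_hat)     # approx. [0, 2, -1]

    # Unsupervised: no outcome measure; describe structure in inputs alone.
    Z = np.vstack([rng.normal(0.0, 1.0, size=(50, 2)),
                   rng.normal(5.0, 1.0, size=(50, 2))])   # two loose clusters
    centers = Z[rng.choice(len(Z), size=2, replace=False)]  # init at two points
    for _ in range(10):  # Lloyd's algorithm: assign points, re-estimate centers
        labels = ((Z[:, None, :] - centers) ** 2).sum(axis=2).argmin(axis=1)
        # Assumes neither cluster goes empty, which holds for this toy data.
        centers = np.array([Z[labels == k].mean(axis=0) for k in range(2)])
    print("k-means centers:", centers)                 # near (0, 0) and (5, 5)

Running the sketch recovers coefficients near the true (2, -1) and centers near the two cluster means: prediction of an outcome in the first case, description of structure in the second.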

Table of Contents

Preface

1 Introduction

2 Overview of Supervised Learning

2.1 Introduction

2.2 Variable Types and Terminology

2.3 Two Simple Approaches to Prediction: Least Squares and Nearest Neighbors

2.3.1 Linear Models and Least Squares

2.3.2 Nearest-Neighbor Methods

2.3.3 From Least Squares to Nearest Neighbors

2.4 Statistical Decision Theory

2.5 Local Methods in High Dimensions

2.6 Statistical Models, Supervised Learning and Function Approximation

2.6.1 A Statistical Model for the Joint Distribution Pr(X,Y)

2.6.2 Supervised Learning

2.6.3 Function Approximation

2.7 Structured Regression Models

2.7.1 Difficulty of the Problem

2.8 Classes of Restricted Estimators

2.8.1 Roughness Penalty and Bayesian Methods

2.8.2 Kernel Methods and Local Regression

2.8.3 Basis Functions and Dictionary Methods

2.9 Model Selection and the Bias-Variance Tradeoff

Bibliographic Notes

Exercises

3 Linear Methods for Regression

3.1 Introduction

3.2 Linear Regression Models and Least Squares

3.2.1 Example: Prostate Cancer

3.2.2 The Gauss-Markov Theorem

3.3 Multiple Regression from Simple Univariate Regression

3.3.1 Multiple Outputs

3.4 Subset Selection and Coefficient Shrinkage

3.4.1 Subset Selection

3.4.2 Prostate Cancer Data Example (Continued)

3.4.3 Shrinkage Methods

3.4.4 Methods Using Derived Input Directions

3.4.5 Discussion:A Comparison of the Selection and Shrinkage Methods

3.4.6 Multiple Outcome Shrinkage and Selection

3.5 Computational Considerations

Bibliographic Notes

Exercises

4 Linear Methods for Classification

4.1 Introduction

4.2 Linear Regression of an Indicator Matrix

4.3 Linear Discriminant Analysis

4.3.1 Regularized Discriminant Analysis

4.3.2 Computations for LDA

4.3.3 Reduced-Rank Linear Discriminant Analysis

4.4 Logistic Regression

4.4.1 Fitting Logistic Regression Models

4.4.2 Example: South African Heart Disease

4.4.3 Quadratic Approximations and Inference

4.4.4 Logistic Regression or LDA?

4.5 Separating Hyperplanes

4.5.1 Rosenblatt's Perceptron Learning Algorithm

4.5.2 Optimal Separating Hyperplanes

Bibliographic Notes

Exercises

5 Basis Expansions and Regularization

5.1 Introduction

5.2 Piecewise Polynomials and Splines

5.2.1 Natural Cubic Splines

5.2.2 Example: South African Heart Disease (Continued)

5.2.3 Example: Phoneme Recognition

5.3 Filtering and Feature Extraction

5.4 Smoothing Splines

5.4.1 Degrees of Freedom and Smoother Matrices

5.5 Automatic Selection of the Smoothing Parameters

5.5.1 Fixing the Degrees of Freedom

5.5.2 The Bias-Variance Tradeoff

5.6 Nonparametric Logistic Regression

5.7 Multidimensional Splines

5.8 Regularization and Reproducing Kernel Hilbert Spaces

5.8.1 Spaces of Functions Generated by Kernels

5.8.2 Examples of RKHS

5.9 Wavelet Smoothing

5.9.1 Wavelet Bases and the Wavelet Transform

5.9.2 Adaptive Wavelet Filtering

Bibliographic Notes

Exercises

Appendix: Computational Considerations for Splines

Appendix: B-splines

Appendix: Computations for Smoothing Splines

6 Kernel Methods

6.1 One-Dimensional Kernel Smoothers

6.1.1 Local Linear Regression

6.1.2 Local Polynomial Regression

6.2 Selecting the Width of the Kernel

6.3 Local Regression in ℝ^p

6.4 Structured Local Regression Models in ℝ^p

6.4.1 Structured Kernels

6.4.2 Structured Regression Functions

6.5 Local Likelihood and Other Models

6.6 Kernel Density Estimation and Classification

6.6.1 Kernel Density Estimation

6.6.2 Kernel Density Classification

6.6.3 The Naive Bayes Classifier

6.7 Radial Basis Functions and Kernels

6.8 Mixture Models for Density Estimation and Classification

6.9 Computational Considerations

Bibliographic Notes

Exercises

7 Model Assessment and Selection

7.1 Introduction

7.2 Bias, Variance and Model Complexity

7.3 The Bias-Variance Decomposition

7.3.1 Example: Bias-Variance Tradeoff

7.4 Optimism of the Training Error Rate

7.5 Estimates of In-Sample Prediction Error

7.6 The Effective Number of Parameters

7.7 The Bayesian Approach and BIC

7.8 Minimum Description Length

7.9 Vapnik-Chervonenkis Dimension

7.9.1 Example (Continued)

7.10 Cross-Validation

7.11 Bootstrap Methods

7.11.1 Example (Continued)

Bibliographic Notes

Exercises

8 Model Inference and Averaging

8.1 Introduction

8.2 The Bootstrap and Maximum Likelihood Methods

8.2.1 A Smoothing Example

8.2.2 Maximum Likelihood Inference

8.2.3 Bootstrap versus Maximum Likelihood

8.3 Bayesian Methods

8.4 Relationship Between the Bootstrap and Bayesian Inference

8.5 The EM Algorithm

8.5.1 Two-Component Mixture Model

8.5.2 The EM Algorithm in General

8.5.3 EM as a Maximization-Maximization Procedure

8.6 MCMC for Sampling from the Posterior

8.7 Bagging

8.7.1 Example: Trees with Simulated Data

8.8 Model Averaging and Stacking

8.9 Stochastic Search: Bumping

Bibliographic Notes

Exercises

9 Additive Models, Trees, and Related Methods

9.1 Generalized Additive Models

9.1.1 Fitting Additive Models

9.1.2 Example: Additive Logistic Regression

9.1.3 Summary

9.2 Tree-Based Methods

10 Boosting and Additive Trees

11 Neural Networks

12 Support Vector Machines and Flexible Discriminants

13 Prototype Methods and Nearest-Neighbors

14 Unsupervised Learning

References

Author Index

Index


Preface

The field of Statistics is constantly challenged by the problems that science and industry bring to its door. In the early days, these problems often came from agricultural and industrial experiments and were relatively small in scope. With the advent of computers and the information age, statistical problems have exploded both in size and complexity. Challenges in the areas of data storage, organization and searching have led to the new field of "data mining"; statistical and computational problems in biology and medicine have created "bioinformatics." Vast amounts of data are being generated in many fields, and the statistician's job is to make sense of it all: to extract important patterns and trends, and understand "what the data says." We call this learning from data.

The challenges in learning from data have led to a revolution in the statistical sciences. Since computation plays such a key role, it is not surprising that much of this new development has been done by researchers in other fields such as computer science and engineering.

The learning problems that we consider can be roughly categorized as either supervised or unsupervised. In supervised learning, the goal is to predict the value of an outcome measure based on a number of input measures; in unsupervised learning, there is no outcome measure, and the goal is to describe the associations and patterns among a set of input measures.
