Noise Filtering for Big Data Analytics / ed. by Souvik Bhattacharyya, Koushik Ghosh.

Bibliographic Details
Superior document: Title is part of eBook package: De Gruyter DG Plus DeG Package 2022 Part 1
Contributor:
Editor: Souvik Bhattacharyya; Koushik Ghosh
Place / Publishing House: Berlin ; Boston : De Gruyter, [2022]
©2022
Year of Publication: 2022
Language: English
Series: De Gruyter Series on the Applications of Mathematics in Engineering and Information Sciences ; 12
Online Access:
Physical Description: 1 online resource (VIII, 156 p.)
Description
Other title: Frontmatter --
Preface --
Contents --
About the Editors --
Application of discrete domain wavelet filter for signal denoising --
Secret sharing scheme in defense and big data analytics --
Recent advances in digital image smoothing: A review --
Double exponential smoothing and its tuning parameters: A re-exploration --
Effect of smoothing on big data governed by polynomial memory --
Heteroskedasticity in panel data: A big challenge to data filtering --
Importance and use of digital filters in digital image processing --
Smart filter and smoothing: A new approach of data denoising --
Acknowledgement --
Index
Summary: This book explains how to perform data de-noising at large scale with a satisfactory level of accuracy. Three main issues are considered. First, how to prevent errors from propagating from one stage to the next while developing a filtered model. Second, how to maintain the positional importance of the data while purifying it. Finally, how to preserve the memory in the data, which is crucial for extracting smart data from noisy big data. If the memory of the data changes substantially after any form of smoothing or filtering is applied, the resulting data may lose important information, which can lead to erroneous conclusions. Yet even though smoothing or filtering may cause some loss of information, denoising cannot be avoided, because any analysis of big data in the presence of noise can itself be misleading. The entire process therefore demands careful execution with efficient and smart models.
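The following sketch is not taken from the book; it is a minimal illustration of the memory issue described in the summary, assuming a simulated AR(1) series, a plain moving-average filter, and lag-1 autocorrelation as a simple proxy for memory. All names and parameter values are illustrative choices.

```python
import numpy as np

# Hypothetical illustration (not from the book): smoothing reduces noise
# but can change the "memory" (autocorrelation structure) of the data.
rng = np.random.default_rng(0)

# Simulate an AR(1) series with memory phi = 0.6, then add observation noise.
n, phi = 5000, 0.6
x = np.zeros(n)
for t in range(1, n):
    x[t] = phi * x[t - 1] + rng.normal()
noisy = x + rng.normal(scale=1.0, size=n)

def lag1_autocorr(y):
    """Lag-1 autocorrelation, used here as a crude measure of memory."""
    y = y - y.mean()
    return float(np.dot(y[:-1], y[1:]) / np.dot(y, y))

# Simple moving-average smoother with an (assumed) window of 15 points.
window = 15
smoothed = np.convolve(noisy, np.ones(window) / window, mode="valid")

print("lag-1 autocorrelation, noisy series:   ", round(lag1_autocorr(noisy), 3))
print("lag-1 autocorrelation, smoothed series:", round(lag1_autocorr(smoothed), 3))
```

On a typical run the smoothed series shows a markedly higher lag-1 autocorrelation than the noisy one, i.e. the filter has altered the memory of the data, which is the kind of trade-off between denoising and information preservation that the summary cautions against.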
Format: Mode of access: Internet via World Wide Web.
ISBN: 9783110697216
9783110766820
9783110993899
9783110994810
9783110994223
9783110994193
ISSN: 2626-5427
DOI: 10.1515/9783110697216
Access: restricted access
Hierarchical level: Monograph
Statement of Responsibility: ed. by Souvik Bhattacharyya, Koushik Ghosh.