
Equivalence and Noninferiority Tests for Quality, Manufacturing and Test Engineers

Bayesian Designs for Phase I-II Clinical Trials

Basic Analysis I: Functions of a Real Variable

Handbook of Alternative Data in Finance, Volume I

Performance, Reliability and Availability Evaluation of Computational Systems, Volume I: Performance and Background


This textbook is intended as a comprehensive and substantially self-contained two-volume work on performance, reliability and availability evaluation. The volumes focus on computing systems, although the methods may also be applied to other systems. The first volume covers Chapters 1 to 14 and carries the subtitle "Performance Modeling and Background"; the second volume encompasses Chapters 15 to 25 and has the subtitle "Reliability and Availability Modeling, Measuring and Workload and Lifetime Data Analysis". The text is helpful for computer performance professionals in planning, designing, configuring and tuning the performance, reliability and availability of computing systems. Such professionals may use these volumes to get acquainted with specific subjects by looking at the particular chapters, and the many examples on computing systems will help them understand the concepts covered in each chapter. The text may also be helpful for instructors who teach performance, reliability and availability evaluation: many possible threads can be configured according to the interests of the audience and the duration of the course, and Chapter 1 presents a good number of possible course programs that could be organized using this text. Volume I is composed of the first two parts, besides Chapter 1. Part I gives the knowledge required for the subsequent parts of the text. This part includes six chapters, covering an introduction to probability; descriptive statistics and exploratory data analysis; random variables, moments and covariance; some helpful discrete and continuous random variables; Taylor series; inference methods; distribution fitting; regression; interpolation; data scaling; distance measures; and some clustering methods.
Part II presents methods for performance evaluation modeling, such as operational analysis, Discrete-Time Markov Chains (DTMC), Continuous-Time Markov Chains (CTMC), Markovian queues, Stochastic Petri Nets (SPN) and discrete event simulation.
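The Markovian queueing models listed for Part II lend themselves to small worked examples. The following Python sketch is not from the book; the arrival and service rates are invented, and it simply evaluates the standard steady-state formulas for an M/M/1 queue together with Little's law:

```python
def mm1_metrics(lam, mu):
    """Steady-state metrics of an M/M/1 queue with arrival rate lam
    and service rate mu (illustrative values only)."""
    if lam >= mu:
        raise ValueError("unstable queue: need lam < mu")
    rho = lam / mu           # server utilization
    L = rho / (1 - rho)      # mean number of jobs in the system
    W = L / lam              # mean time in system, via Little's law L = lam * W
    return rho, L, W

# 8 jobs/s arriving at a server that completes 10 jobs/s:
rho, L, W = mm1_metrics(8.0, 10.0)
print(rho, L, W)
```

With these rates the server is 80% utilized, holds 4 jobs on average, and a job spends 0.5 s in the system.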

GBP 120.00

Foundations of Quantitative Finance Book I: Measure Spaces and Measurable Functions


This is the first in a set of 10 books written for professionals in quantitative finance. These books fill the gap between informal mathematical developments found in introductory materials and more advanced treatments that summarize, without formally developing, the important foundational results professionals need. Book I in the Foundations of Quantitative Finance series develops topics in measure spaces and measurable functions and lays the foundation for subsequent volumes. Lebesgue and then Borel measure theory are developed on ℝ, motivating the general extension theory of measure spaces that follows. This general theory is applied to finite product measure spaces, Borel measures on ℝⁿ and infinite-dimensional product probability spaces. The overriding goal of these books is a complete and detailed development of the many mathematical theories and results one finds in popular resources in finance and quantitative finance. Each book is dedicated to a specific area of mathematics or probability theory, with applications to finance that are relevant to the needs of professionals. Practitioners, academic researchers and students will find these books valuable to their career development. All ten volumes are extensively self-referenced: the reader can enter the collection at any point or topic of interest and then work backward to identify and fill in needed details. This approach also works for a course or self-study on a given volume, with earlier books used for reference. Advanced quantitative finance books typically develop materials with an eye to comprehensiveness in the given subject matter, yet not with an eye toward efficiently curating and developing the theories needed for applications in quantitative finance. This book and series of volumes fill that need.

GBP 68.99

Nonparametric Statistical Tests: A Computational Approach


Nonparametric Statistical Tests: A Computational Approach describes classical nonparametric tests as well as novel and little-known methods such as the Baumgartner-Weiss-Schindler and Cucconi tests. The book presents SAS and R programs that allow readers to carry out the different statistical methods, such as permutation and bootstrap tests. The author considers example data sets in each chapter to illustrate the methods, and numerous real-life data sets from various areas, including the Bible, make for greatly diversified reading. The book covers:
- Nonparametric two-sample tests for the location-shift model, specifically the Fisher-Pitman permutation test, the Wilcoxon rank sum test and the Baumgartner-Weiss-Schindler test
- Permutation tests, location-scale tests, tests for the nonparametric Behrens-Fisher problem and tests for a difference in variability
- Tests for the general alternative, including the (Kolmogorov-)Smirnov test, and ordered categorical and discrete numerical data
- Well-known one-sample tests such as the sign test and Wilcoxon's signed rank test, a modification suggested by Pratt (1959), a permutation test with original observations and a one-sample bootstrap test
- Tests for more than two groups, described in detail: the Kruskal-Wallis test, the permutation F test, the Jonckheere-Terpstra trend test, tests for umbrella alternatives, and the Friedman and Page tests for multiple dependent groups
- The concepts of independence and correlation, and stratified tests such as the van Elteren test and combination tests
- The applicability of computer-intensive methods such as bootstrap and permutation tests for non-standard situations and complex designs
Although the major development of nonparametric methods came to a certain end in the 1970s, their importance undoubtedly persists. What is still needed is a computer-assisted evaluation of their main properties. This book closes that gap.
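As a hedged sketch of the permutation idea (not code from the book; the data are invented), a Fisher-Pitman-style two-sample test can be approximated by recomputing the difference in means under random relabelings of the pooled sample:

```python
import random

def permutation_test(x, y, n_perm=10000, seed=0):
    """Approximate two-sided p-value for a difference in means,
    using random relabelings of the pooled sample."""
    rng = random.Random(seed)
    observed = sum(x) / len(x) - sum(y) / len(y)
    pooled = list(x) + list(y)
    count = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)
        stat = sum(pooled[:len(x)]) / len(x) - sum(pooled[len(x):]) / len(y)
        if abs(stat) >= abs(observed):
            count += 1
    return (count + 1) / (n_perm + 1)   # add-one rule avoids p = 0

# Invented data: treatment vs control measurements.
p = permutation_test([12.1, 9.8, 11.5, 10.9], [8.2, 7.9, 9.1, 8.5])
```

A bootstrap test follows the same recipe with resampling in place of relabeling.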

GBP 69.99

Beyond First Order Model Theory, Volumes I and II


Model theory is the meta-mathematical study of the concept of mathematical truth. After Alfred Tarski coined the term "theory of models" in the early 1950s, it rapidly became one of the central and most active branches of mathematical logic. In the last few decades, ideas that originated within model theory have provided powerful tools to solve problems in a variety of areas of classical mathematics, including algebra, combinatorics, geometry, number theory, and Banach space and operator theory. The two volumes of Beyond First Order Model Theory present the reader with a fairly comprehensive vista, rich in width and depth, of some of the most active areas of contemporary research in model theory beyond the realm of the classical first-order viewpoint. Each chapter is intended to serve both as an introduction to a current direction in model theory and as a presentation of results that are not available elsewhere. All the articles are written so that they can be studied independently of one another. The first volume is an introduction to current trends in model theory and contains a collection of articles authored by top researchers in the field; it is intended as a reference for students as well as senior researchers. The second volume contains introductions to real-valued logic and applications, abstract elementary classes and applications, interconnections between model theory and function spaces, nonstructure theory, and the model theory of second-order logic. Features: A coherent introduction to current trends in model theory. Contains articles by some of the most influential logicians of the last hundred years; no other publication brings these distinguished authors together. Suitable as a reference for advanced undergraduates, postgraduates and researchers. Material presented in the book (e.g., abstract elementary classes, first-order logics with dependent sorts and applications of infinitary logics in set theory) is not easily accessible in the current literature. The various chapters in the book can be studied independently.

GBP 230.00

Sequential Analysis: Hypothesis Testing and Changepoint Detection


Sequential Analysis: Hypothesis Testing and Changepoint Detection systematically develops the theory of sequential hypothesis testing and quickest changepoint detection. It also describes important applications in which theoretical results can be used efficiently. The book reviews recent accomplishments in hypothesis testing and changepoint detection, both in decision-theoretic (Bayesian) and non-decision-theoretic (non-Bayesian) contexts. The authors emphasize not only traditional binary hypotheses but also substantially more difficult multiple decision problems. They address scenarios with simple hypotheses and more realistic cases of two and finitely many composite hypotheses. The book primarily focuses on practical discrete-time models, with certain continuous-time models also examined when general results can be obtained very similarly in both cases. It treats both conventional i.i.d. and general non-i.i.d. stochastic models in detail, including Markov, hidden Markov, state-space, regression and autoregression models. Rigorous proofs are given for the most important results. Written by leading authorities in the field, this book covers the theoretical developments and applications of sequential hypothesis testing and sequential quickest changepoint detection in a wide range of engineering and environmental domains. It explains how the theoretical aspects influence the hypothesis testing and changepoint detection problems as well as the design of algorithms.
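For a concrete feel for quickest changepoint detection, here is a minimal sketch (not from the book) of Page's one-sided CUSUM rule; the reference value k, the threshold h and the noiseless toy data are invented for illustration:

```python
def cusum_alarm(samples, k=0.5, h=4.0):
    """Return the first index at which the one-sided CUSUM statistic
    S_n = max(0, S_{n-1} + x_n - k) exceeds the threshold h, or None."""
    s = 0.0
    for n, x in enumerate(samples):
        s = max(0.0, s + x - k)
        if s > h:
            return n
    return None

# The mean shifts from 0 to 2 at index 20 in this toy sequence.
data = [0.0] * 20 + [2.0] * 20
print(cusum_alarm(data))  # 22 -- the alarm fires two samples after the change
```

Lowering h detects the change faster at the cost of more false alarms, which is exactly the trade-off the theory quantifies.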

GBP 44.99

Theory of Statistical Inference


Theory of Statistical Inference is designed as a reference on statistical inference for researchers and students at the graduate or advanced undergraduate level. It presents a unified treatment of the foundational ideas of modern statistical inference and would be suitable for a core course in a graduate program in statistics or biostatistics. The emphasis is on the application of mathematical theory to the problem of inference, leading to an optimization theory allowing the choice of those statistical methods yielding the most efficient use of data. The book shows how a small number of key concepts, such as sufficiency, invariance, stochastic ordering, decision theory and vector space algebra, play a recurring and unifying role. The volume can be divided into four sections. Part I provides a review of the required distribution theory. Part II introduces the problem of statistical inference, including the definitions of the exponential family and of invariant and Bayesian models; basic concepts of estimation, confidence intervals and hypothesis testing are introduced here. Part III constitutes the core of the volume, presenting a formal theory of statistical inference. Beginning with decision theory, this section then covers uniformly minimum variance unbiased (UMVU) estimation, minimum risk equivariant (MRE) estimation and the Neyman-Pearson test. Finally, Part IV introduces large sample theory. This section begins with stochastic limit theorems, the δ-method, the Bahadur representation theorem for sample quantiles, large sample U-estimation, the Cramér-Rao lower bound and asymptotic efficiency. A separate chapter is then devoted to estimating equation methods. The volume ends with a detailed development of large sample hypothesis testing based on the likelihood ratio test (LRT), the Rao score test and the Wald test.
Features: This volume includes treatment of linear and nonlinear regression models, ANOVA models, generalized linear models (GLM) and generalized estimating equations (GEE). An introduction to decision theory (including risk, admissibility, classification, and Bayes and minimax decision rules) is presented, and the importance of this sometimes overlooked topic to statistical methodology is emphasized. The volume emphasizes throughout the important role that can be played by group theory and invariance in statistical inference. Nonparametric (rank-based) methods are derived by the same principles used for parametric models and are therefore presented as solutions to well-defined mathematical problems, rather than as robust heuristic alternatives to parametric methods. Each chapter ends with a set of theoretical and applied exercises integrated with the main text. Problems involving R programming are included. Appendices summarize the necessary background in analysis, matrix algebra and group theory.

GBP 99.99

Quantum Computation


Quantum Computation presents the mathematics of quantum computation. The purpose is to introduce the topic of quantum computing to students in computer science, physics and mathematics who have no prior knowledge of this field. The book is written in two parts. The primary mathematical topics required for an initial understanding of quantum computation are dealt with in Part I: sets, functions, complex numbers and other relevant mathematical structures from linear and abstract algebra. Topics are illustrated with examples focussing on the quantum computational aspects that follow in more detail in Part II. Part II discusses quantum information, quantum measurement and quantum algorithms. These topics provide foundations upon which more advanced topics may be approached with confidence. Features: A more accessible approach than most competitor texts, which move into advanced research-level topics too quickly for today's students. Part I is comprehensive in providing all necessary mathematical underpinning, particularly for those who need more opportunity to develop their mathematical competence. More confident students may move directly to Part II and dip back into Part I as a reference. Ideal for use as an introductory text for courses in quantum computing. Fully worked examples illustrate the application of mathematical techniques. Exercises throughout develop concepts and enhance understanding. End-of-chapter exercises offer more practice in developing a secure foundation.
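To give a flavour of the linear algebra involved (this sketch is illustrative and not taken from the book): a single-qubit state is a pair of complex amplitudes, a gate is a linear map on them, and measurement probabilities are the squared magnitudes of the amplitudes:

```python
import math

def hadamard(state):
    """Apply the Hadamard gate to a single-qubit state (a, b),
    where a and b are the amplitudes of |0> and |1>."""
    a, b = state
    r = 1 / math.sqrt(2)
    return (r * (a + b), r * (a - b))

def probabilities(state):
    """Measurement probabilities are squared amplitude magnitudes."""
    return tuple(abs(amp) ** 2 for amp in state)

plus = hadamard((1, 0))            # H|0> = (|0> + |1>) / sqrt(2)
p0, p1 = probabilities(plus)
print(round(p0, 3), round(p1, 3))  # 0.5 0.5
```

Measuring H|0> gives each outcome with probability 1/2, the standard "fair coin" example of superposition.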

GBP 74.99

Statistical Inference Based on Divergence Measures


The idea of using functionals of information theory, such as entropies or divergences, in statistical inference is not new. However, in spite of the fact that divergence statistics have become a very good alternative to the classical likelihood ratio test and the Pearson-type statistic in discrete models, many statisticians remain unaware of this powerful approach. Statistical Inference Based on Divergence Measures explores classical problems of statistical inference, such as estimation and hypothesis testing, on the basis of measures of entropy and divergence. The first two chapters form an overview, from a statistical perspective, of the most important measures of entropy and divergence and study their properties. The author then examines the statistical analysis of discrete multivariate data, with emphasis on problems in contingency tables and loglinear models, using phi-divergence test statistics as well as minimum phi-divergence estimators. The final chapter looks at testing in general populations, presenting the interesting possibility of introducing alternative test statistics to classical ones like the Wald, Rao and likelihood ratio tests. Each chapter concludes with exercises that clarify the theoretical results and present additional results that complement the main discussions. Clear, comprehensive and logically developed, this book offers a unique opportunity to gain not only a new perspective on some standard statistics problems, but also the tools to put it into practice.
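To make the divergence viewpoint concrete, here is a hedged sketch (the cell counts are invented): both the Pearson chi-square statistic and the likelihood-ratio statistic G² can be read as divergences between observed and expected contingency-table counts:

```python
import math

def pearson_chi2(observed, expected):
    """Pearson's chi-square statistic over table cells."""
    return sum((o - e) ** 2 / e for o, e in zip(observed, expected))

def g2(observed, expected):
    """Likelihood-ratio statistic G^2, a scaled Kullback-Leibler divergence."""
    return 2 * sum(o * math.log(o / e) for o, e in zip(observed, expected) if o > 0)

obs = [30, 20, 50]        # observed cell counts (invented)
exp = [25, 25, 50]        # expected counts under the null hypothesis
print(round(pearson_chi2(obs, exp), 3), round(g2(obs, exp), 3))  # 2.0 2.014
```

Both are members of the phi-divergence family the book studies; they differ only in the choice of the convex function phi.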

GBP 44.99

Handbook of Educational Measurement and Psychometrics Using R

Quantitative Methods for Traditional Chinese Medicine Development


A Western-Based Approach to Analyzing TCMs. In recent years, many pharmaceutical companies and clinical research organizations have been focusing on the development of traditional Chinese (herbal) medicines (TCMs) as alternatives to treating critical or life-threatening diseases and as pathways to personalized medicine. Quantitative Methods for Traditional Chinese Medicine Development is the first book entirely devoted to the design and analysis of TCM development from a Western perspective, i.e., evidence-based clinical research and development. The book provides not only a comprehensive summary of innovative quantitative methods for developing TCMs but also a useful desk reference for principal investigators involved in personalized medicine. Written by one of the world’s most prominent biostatistics researchers, the book connects the pharmaceutical industry, regulatory agencies and academia. It presents a state-of-the-art examination of the subject for: scientists and researchers who are engaged in pharmaceutical/clinical research and development of TCMs; those in regulatory agencies who make decisions in the review and approval process of TCM regulatory submissions; and biostatisticians who provide statistical support to assess the clinical safety and effectiveness of TCMs and related issues regarding quality control and assurance, as well as to test for consistency in the manufacturing processes for TCMs. This book covers all of the statistical issues encountered at various stages of pharmaceutical/clinical development of a TCM. It explains regulatory requirements; product specifications and standards; and various statistical techniques for the evaluation of TCMs, validation of diagnostic procedures and testing of consistency. It also contains an entire chapter of case studies and addresses critical issues in TCM development and FAQs from a

GBP 59.99

Basic Statistics and Pharmaceutical Statistical Applications

Computer Systems Architecture


Computer Systems Architecture provides IT professionals and students with the necessary understanding of computer hardware. It addresses the ongoing issues related to computer hardware and discusses the solutions supplied by the industry. The book describes trends in computing solutions that led to the currently available infrastructures, tracing the initial need for computers to recent concepts such as the Internet of Things. It covers computers’ data representation, explains how computer architecture and its underlying meaning changed over the years, and examines the implementations and performance enhancements of the central processing unit (CPU). It then discusses the organization, hierarchy and performance considerations of computer memory as applied by the operating system, and illustrates how cache memory significantly improves performance. The author proceeds to explore the bus system, algorithms for ensuring data integrity, input and output (I/O) components, methods for performing I/O, various aspects relevant to software engineering, nonvolatile storage devices such as hard drives, and technologies for enhancing performance and reliability. He also describes virtualization and cloud computing and the emergence of software-based systems architectures. Accessible to software engineers and developers as well as students in IT disciplines, this book enhances readers’ understanding of the hardware infrastructure used in software engineering projects. It enables readers to better optimize system usage by focusing on the principles used in hardware systems design and the methods for enhancing performance.

GBP 44.99

Survival Analysis with Interval-Censored Data: A Practical Approach with Examples in R, SAS and BUGS


Survival Analysis with Interval-Censored Data: A Practical Approach with Examples in R, SAS and BUGS provides the reader with a practical introduction to the analysis of interval-censored survival times. Although many theoretical developments have appeared in the last fifty years, interval censoring is often ignored in practice, and many are unaware of the impact of dealing inappropriately with interval censoring. In addition, the necessary software is at times difficult to trace. This book fills the gap between theory and practice. Features: Provides an overview of frequentist as well as Bayesian methods. Includes a focus on practical aspects and applications. Extensively illustrates the methods with examples using R, SAS and BUGS; full programs are available on a supplementary website. The authors: Kris Bogaerts is project manager at I-BioStat, KU Leuven. He received his PhD in science (statistics) at KU Leuven on the analysis of interval-censored data. He has gained expertise in a great variety of statistical topics, with a focus on the design and analysis of clinical trials. Arnošt Komárek is associate professor of statistics at Charles University, Prague. His subject area of expertise covers mainly survival analysis, with emphasis on interval-censored data, and classification based on longitudinal data. He is past chair of the Statistical Modelling Society and editor of Statistical Modelling: An International Journal. Emmanuel Lesaffre is professor of biostatistics at I-BioStat, KU Leuven. His research interests include Bayesian methods, longitudinal data analysis, statistical modelling, the analysis of dental data, interval-censored data, misclassification issues and clinical trials. He is the founding chair of the Statistical Modelling Society, past president of the International Society for Clinical Biostatistics, and a fellow of ISI and ASA.

GBP 44.99

Grid Computing: Techniques and Applications


Designed for senior undergraduate and first-year graduate students, Grid Computing: Techniques and Applications shows professors how to teach this subject in a practical way. Extensively classroom-tested, it covers job submission and scheduling, Grid security, Grid computing services and software tools, graphical user interfaces, workflow editors and Grid-enabling applications. The book begins with an introduction that discusses the use of a Grid computing Web-based portal. It then examines the underlying action of job submission using a command-line interface and the use of a job scheduler. After describing both general Internet security techniques and specific security mechanisms developed for Grid computing, the author focuses on Web services technologies and how they are adopted for Grid computing. He also discusses the advantages of using a graphical user interface over a command-line interface, and presents a graphical workflow editor that enables users to compose sequences of computational tasks visually using a simple drag-and-drop interface. The final chapter explains how to deploy applications on a Grid. The Grid computing platform offers much more than simply running an application at a remote site: it also enables multiple geographically distributed computers to collectively obtain increased speed and fault tolerance. Illustrating this kind of resource discovery, this practical text encompasses the varied and interconnected aspects of Grid computing, including how to design a system infrastructure and Grid portal. Supplemental Web resources: The author’s Web site offers various instructional resources, including slides and links to software for programming assignments. Many of these assignments do not require access to a Grid platform; instead, the author provides step-by-step instructions for installing open-source software to deploy and test Web and Grid services, a Grid computing workflow editor to design and test workflows, and a Grid computing portal to deploy portlets.

GBP 69.99

Linux: The Textbook, Second Edition


Chosen by BookAuthority as one of its Best Linux Mint Books of All Time, Linux: The Textbook, Second Edition provides comprehensive coverage of the contemporary use of the Linux operating system for every level of student or practitioner, from beginners to advanced users. The text clearly illustrates system-specific commands and features using the Debian family (Debian, Ubuntu and Linux Mint) and the RHEL family (CentOS), and stresses universal commands and features that are critical to all Linux distributions. The second edition of the book includes extensive updates and new chapters on system administration for desktop stand-alone PCs and server-class computers; the API for system programming, including thread programming with pthreads; virtualization methodologies; and an extensive tutorial on systemd service management. Brand-new online content on the CRC Press website includes an instructor’s workbook, test bank and in-chapter exercise solutions, as well as full downloadable chapters on Python version 3.5 programming, ZFS, TC shell programming, advanced system programming and more. An author-hosted GitHub website also features updates, further references and errata.
Features: New or updated coverage of file systems, sorting, regular expressions, directory and file searching, file compression and encryption, shell scripting, system programming, client-server-based network programming, thread programming with pthreads and system administration. Extensive in-text pedagogy, including chapter objectives, student projects, and basic and advanced student exercises for every chapter. Expansive electronic downloads offer advanced content on Python, ZFS, TC shell scripting, advanced system programming, internetworking with Linux TCP/IP and many more topics, all featured on the CRC Press website. Downloadable test bank, workbook and solutions available for instructors on the CRC Press website. An author-maintained GitHub repository provides other resources, such as live links to further references, updates and errata.

GBP 38.99

Practical Multivariate Analysis

Handbook of Statistics in Clinical Oncology


Many new challenges have arisen in the area of oncology clinical trials. New cancer therapies are often based on cytostatic or targeted agents, which pose new challenges in the design and analysis of all phases of trials. The literature on adaptive trial designs and early stopping has been exploding, inclusion of high-dimensional data and imaging techniques has become common practice, and statistical methods for analysing such data have been refined in this area. A compilation of statistical topics relevant to these new advances in cancer research, this third edition of the Handbook of Statistics in Clinical Oncology focuses on the design and analysis of oncology clinical trials and translational research. Addressing the many challenges that have arisen since the publication of its predecessor, this third edition covers the newest developments in the design and analysis of cancer clinical trials, incorporating updates to all four parts: Phase I trials: updated recommendations regarding the standard 3 + 3 and continual reassessment approaches, along with new chapters on phase 0 trials and phase I trial design for targeted agents. Phase II trials: updates to current experience in single-arm and randomized phase II trial designs; new chapters include phase II designs with multiple strata and phase II/III designs. Phase III trials: many new chapters, including interim analyses and early stopping considerations, phase III trial designs for targeted agents and for testing the ability of markers, adaptive trial designs, cure rate survival models, statistical methods for imaging, and a thorough review of software for the design and analysis of clinical trials. Exploratory and high-dimensional data analyses: all chapters in this part have been thoroughly updated since the last edition; new chapters address methods for analyzing SNP data and for developing a score based on gene expression data, and chapters on risk calculators and forensic bioinformatics have been added. Accessible to statisticians and oncologists interested in clinical trial methodology, the book is a single-source collection of up-to-date statistical approaches to research in clinical oncology.

GBP 52.99

Reproducible Research with R and RStudio


Praise for previous editions: “Gandrud has written a great outline of how a fully reproducible research project should look from start to finish, with brief explanations of each tool that he uses along the way… Advanced undergraduate students in mathematics, statistics and similar fields, as well as students just beginning their graduate studies, would benefit the most from reading this book. Many more experienced R users or second-year graduate students might find themselves thinking, ‘I wish I’d read this book at the start of my studies, when I was first learning R!’… This book could be used as the main text for a class on reproducible research.” (The American Statistician)
Reproducible Research with R and RStudio, Third Edition brings together the skills and tools needed for doing and presenting computational research. Using straightforward examples, the book takes you through an entire reproducible research workflow. This practical workflow enables you to gather and analyze data as well as dynamically present results in print and on the web. Supplementary materials and examples are available on the author’s website.
New to the Third Edition: Updated package recommendations, examples and URLs, with technologies no longer in regular use removed. More advanced R Markdown (and less LaTeX) in discussions of markup languages and examples. Stronger focus on reproducible working-directory tools. Updated discussion of cloud storage services and persistent, reproducible material citation. Added discussion of Jupyter notebooks and reproducible practices in industry. Examples of data manipulation with Tidyverse tibbles (in addition to standard data frames) and the pivot_longer() and pivot_wider() functions for pivoting data.
Features: Incorporates the most important advances that have been developed since the previous editions were published. Describes a complete reproducible research workflow, from data gathering to the presentation of results. Shows how to automatically generate tables and figures using R. Includes instructions on formatting a presentation document via markup languages. Discusses cloud storage and versioning services, particularly GitHub. Explains how to use Unix-like shell programs for working with large research projects.
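The pivoting functions mentioned above have simple semantics, sketched here in plain Python rather than the Tidyverse (this illustrates the idea only; the column names are invented):

```python
def pivot_longer(rows, id_col, value_cols):
    """Turn wide columns into (name, value) pairs, one row per pair."""
    out = []
    for row in rows:
        for col in value_cols:
            out.append({id_col: row[id_col], "name": col, "value": row[col]})
    return out

def pivot_wider(long_rows, id_col):
    """Reverse pivot_longer: gather (name, value) pairs back into wide rows."""
    wide = {}
    for row in long_rows:
        wide.setdefault(row[id_col], {id_col: row[id_col]})[row["name"]] = row["value"]
    return list(wide.values())

# Pivoting to long form and back recovers the original table.
wide = [{"id": 1, "x": 10, "y": 20}]
assert pivot_wider(pivot_longer(wide, "id", ["x", "y"]), "id") == wide
```

The long form is what plotting and modelling tools usually want; the wide form is what spreadsheets usually provide, hence the round trip.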

GBP 56.99

Mathematical Modeling in Biology: A Research Methods Approach

Promoting Statistical Practice and Collaboration in Developing Countries


“Rarely, but just often enough to rebuild hope, something happens to confound my pessimism about the recent unprecedented happenings in the world. This book is the most recent instance, and I think that all its readers will join me in rejoicing at the good it seeks to do. It is an example of the kind of international comity and collaboration that we could and should undertake to solve various societal problems. This book is a beautiful example of the power of the possible. [It] provides a blueprint for how the LISA 2020 model can be replicated in other fields. Civil engineers or accountants or nurses or any other profession could follow this outline to share expertise, build capacity and promote progress in other countries. It also contains some tutorials for statistical literacy across several fields. The details would change, of course, but ideas are durable and the generalizations seem pretty straightforward. This book shows every other profession where and how to stand in order to move the world. I urge every researcher to get a copy!” —David Banks, from the Foreword
Promoting Statistical Practice and Collaboration in Developing Countries provides new insights into the current issues and opportunities in international statistics education, statistical consulting and collaboration, particularly in developing countries around the world. The book addresses the topics discussed in individual chapters from the perspectives of the historical context, the present state and the future directions of statistical training and practice, so that readers may fully understand the challenges and opportunities in the field of statistics and data science, especially in developing countries.
Features: • A reference point on statistical practice in developing countries for researchers, scholars, students and practitioners • A comprehensive source of state-of-the-art knowledge on creating statistical collaboration laboratories within the field of data science and statistics • A collection of innovative statistical teaching and learning techniques in developing countries
Each chapter consists of independent case study contributions on a particular theme, developed with a common structure and format. The common goal across the chapters is to enhance the exchange of diverse educational and action-oriented information among our intended audiences, which include practitioners, researchers, students and statistics educators in developing countries.

GBP 105.00