Research Paper Structure
From my observation, the research papers I have read generally follow the same structure. Aside from the title and authors at the top of the first page, the proponents state the abstract of their study on the front page, where they give readers an overview of the paper's content. The Introduction follows the Abstract; there the researchers lay out the fundamental details and principles the reader needs to know first, which may include the descriptions, definitions, and components of the study. The researchers then detail the procedures conducted in the study, along with the specifications that come with the process. After the authors have carried out the experimentation phase on their claims, the results are discussed, interpreted, and analyzed in the Results and Discussion section of the paper. To wrap things up, and to discuss possible future endeavors in the field of study, the proponents conclude their study along with their observations and predictions for the future.
New Concepts
Activated Sludge
Activated sludge is a complex ecosystem composed mainly of bacteria and protozoa. In fact, a good balance between the bacteria and protozoa, and among the different species of each group, is crucial for efficient pollution removal, good settleability properties, and low suspended solids effluent levels. Bacteria agglomerate as aggregates, mainly due to exopolymer excretion and a filamentous bacteria backbone, towards an ideal floc with reasonable size and a well-balanced composition of floc-forming and filamentous bacteria. Activated sludge flocs are made up of microbial colonies embedded in a cloud of extracellular polymeric substances (EPS). The EPS are produced by the microorganisms either by cell lysis or by active transport. Other floc constituents are organic fibres, adsorbed organic particles from the wastewater, and inorganic components. It is believed that the EPS play a large role in the formation of sludge flocs since they constitute the major organic fraction of the sludge. The EPS are typically made up of protein, humic substances, carbohydrates, nucleic acids, and lipids.
Source:
Wilén, B., Lumley, D., Mattsson, A., & Mino, T. (2008). Relationship between floc composition and flocculation and settling properties studied at a full scale activated sludge plant. Water Research,42(16), 4404-4418. doi:10.1016/j.watres.2008.07.033
Amaral, A., & Ferreira, E. (2005). Activated sludge monitoring of a wastewater treatment plant using image analysis and partial least squares regression. Analytica Chimica Acta,544(1-2), 246-253. doi:10.1016/j.aca.2004.12.061
Filamentous Bulking
Sludge bulking occurs when the sludge fails to separate out in the sedimentation tanks. Bulking sludge, a term used to describe the excessive growth of filamentous bacteria, is a common problem in the activated sludge process. The term is often also used for non-filamentous poor settling, but in this study it refers only to filamentous sludge.
Source: Martins, A. M., Pagilla, K., Heijnen, J. J., & Loosdrecht, M. C. (2004). Filamentous bulking sludge—a critical review. Water Research,38(4), 793-817. doi:10.1016/j.watres.2003.11.005
Particle Sedimentation
Sedimentation of a suspension is generally assessed by a jar test, during which a suspension is allowed to settle and the height of the clear liquid (supernatant)-suspension interface is measured as a function of the settling time. In a jar test, particles can be observed to settle in any of several quite different ways, depending on their concentration and their tendency to cohere. The different modes of sedimentation make different demands on the size and shape of a settling tank, and different test procedures are used for evaluating them. In addition to particle size, density and concentration, and fluid viscosity, other less obvious factors affect the sedimentation rate. These include particle shape and orientation, convection currents in the surrounding fluid, and chemical pretreatment of the feed suspension. Particles with diameters of the order of a few microns settle too slowly for most practical operations; wherever possible these are coagulated or flocculated to increase their effective size, and hence their rate of settling.
Source: http://www.thermopedia.com/content/1114/
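To put rough numbers on why micron-scale particles settle so slowly, here is a small illustrative sketch using Stokes' law for a sphere settling in laminar flow. The particle and fluid properties are assumed example values, not figures from the source above.

```python
# Illustrative sketch: Stokes' law terminal settling velocity for a small
# sphere in laminar flow, v = (rho_p - rho_f) * g * d**2 / (18 * mu).
# The particle/fluid properties below are assumed example values.

def stokes_settling_velocity(d_m, rho_p=1050.0, rho_f=998.0, mu=1.0e-3, g=9.81):
    """Terminal settling velocity (m/s) of a sphere of diameter d_m (m)."""
    return (rho_p - rho_f) * g * d_m ** 2 / (18.0 * mu)

for d_um in (2, 20, 200):                      # particle diameters in microns
    v = stokes_settling_velocity(d_um * 1e-6)  # convert microns to metres
    print(f"d = {d_um:>3} um -> v = {v * 1000:.4f} mm/s")
```

Because the velocity scales with the square of the diameter, coagulating a 2 micron particle into a 20 micron floc raises its settling rate roughly a hundredfold, which is why flocculation is used in the first place.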
Digital Image Processing
Image processing is a method of performing operations on an image in order to enhance it or to extract useful information from it. It is a type of signal processing in which the input is an image and the output may be an image or the characteristics/features associated with that image. Nowadays, image processing is among the most rapidly growing technologies, and it forms a core research area within the engineering and computer science disciplines.
Image processing basically includes the following three steps:
Importing the image via image acquisition tools;
Analysing and manipulating the image;
Output in which result can be altered image or report that is based on image analysis.
There are two types of methods used for image processing, namely analogue and digital image processing. Analogue image processing can be used for hard copies like printouts and photographs; image analysts use various fundamentals of interpretation while applying these visual techniques. Digital image processing techniques help in manipulating digital images using computers. The three general phases that all types of data undergo when using the digital technique are pre-processing, enhancement and display, and information extraction.
Source: https://sisu.ut.ee/imageprocessing/book/1
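As a rough illustration of the three basic steps described above (import, analyse/manipulate, output), here is a minimal sketch assuming the scikit-image and matplotlib libraries and one of scikit-image's bundled sample images; it is not the procedure from any of the cited sources.

```python
# Minimal sketch of the three basic steps: import, analyse/manipulate, output.
# Assumes scikit-image and matplotlib are installed; uses a bundled sample image.
from skimage import data, color, filters
import matplotlib.pyplot as plt

# 1. Import the image (a bundled sample stands in for an acquisition tool).
image = data.coffee()

# 2. Analyse and manipulate: convert to greyscale and detect edges.
grey = color.rgb2gray(image)
edges = filters.sobel(grey)

# 3. Output: an altered image plus a simple report based on the analysis.
plt.imsave("edges.png", edges, cmap="gray")
print(f"Image shape: {grey.shape}, mean intensity: {grey.mean():.3f}")
```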
MATLAB
MATLAB (matrix laboratory) is a multi-paradigm numerical computing environment. A proprietary programming language developed by MathWorks, MATLAB allows matrix manipulations, plotting of functions and data, implementation of algorithms, creation of user interfaces, and interfacing with programs written in other languages, including C, C++, C#, Java, Fortran and Python. Uses for MATLAB include matrix calculations, developing and running algorithms, creating user interfaces (UI) and data visualization. The multi-paradigm numerical computing environment allows developers to interface with programs developed in different languages, which makes it possible to harness the unique strengths of each language for various purposes. MATLAB is used by engineers and scientists in many fields such as image and signal processing, communications, control systems for industry, smart grid design, robotics as well as computational finance.
Source: http://whatis.techtarget.com/definition/MATLAB
Claims and Evidence Citations
An Iterative Algorithm for Minimum Cross Entropy Thresholding
C.H. Li and P.K.S. Tam, March 1998
The intervals for μ1 and μ2 were chosen such that μ1 and μ2 are separated from each other and away from the maximum and minimum gray values of 255 and 0. The intervals for the standard deviations σ1 and σ2 are chosen to cover situations of minimal overlapping to high overlapping of gray levels between the foreground and the background. The proportions of the background against the foreground are in ratios ranging from 1:99 to 99:1, which should cover commonly occurring situations. A total of 1000 histograms are generated.
For each of the 1000 histograms, the threshold is obtained from both the iterative procedure and an exhaustive search. The difference in the threshold values between the iterative version and the exhaustive version is recorded for the 1000 histograms and the mean absolute difference and the standard deviation of the difference are selected as the average performance criteria for the iterative procedure.
The iterative method is started with an initial threshold value of 128 for all 1000 histograms. Since 128 is the middle of the gray level range of the histogram, this value is a natural choice for initializing the iteration. The average error for the iterative version is only 0.39, implying that the iterative method correctly locates the threshold in more than half of the testing histograms.
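For intuition, the one-point iteration can be sketched roughly as below. This follows the mean-based update used in common open-source implementations of Li's minimum cross entropy method (for example, scikit-image's threshold_li); it is an illustrative reconstruction run on synthetic pixel data, not the authors' own code.

```python
import numpy as np

def li_iterative_threshold(image, t0=128.0, tol=0.5):
    """One-point iteration for minimum cross entropy thresholding.

    Starts from an initial guess t0 (128 = middle of the 8-bit range) and
    repeatedly replaces the threshold with a function of the background and
    foreground means until the change falls below `tol`.
    """
    image = np.asarray(image, dtype=float)
    t_curr, t_next = -np.inf, float(t0)
    while abs(t_next - t_curr) > tol:
        t_curr = t_next
        fore = image > t_curr
        mean_fore = image[fore].mean()
        mean_back = image[~fore].mean()
        # Mean-based update used in common implementations of Li's method.
        t_next = (mean_back - mean_fore) / (np.log(mean_back) - np.log(mean_fore))
    return t_next

# Example on a synthetic bimodal gray-level distribution.
rng = np.random.default_rng(0)
pixels = np.concatenate([rng.normal(60, 15, 5000), rng.normal(180, 20, 5000)])
pixels = np.clip(pixels, 1, 255)  # keep values positive, as the method requires
print(f"estimated threshold: {li_iterative_threshold(pixels):.1f}")
```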
Activated Sludge Monitoring Of a Wastewater Treatment Plant Using Image Analysis and Partial Least Squares Regression
A.L. Amaral, E.C. Ferreira, January 2005
In the aggregate morphological analysis, it was apparent that the smaller aggregates formed somewhat loose, elongated structures, differing from the larger aggregates in terms of roughness and smoothness, respectively. The aggregate area distribution showed negligible values for the larger aggregates throughout the survey, as opposed to the clearly predominant 0.1–1 mm class, which seems to point to the prevalence of normal flocs instead of pinpoint or zoogleal flocs. This allows the inference that the bulking problems experienced within the aerated tank were probably not of a zoogleal nature but rather of a filamentous one. This hypothesis is further supported by the time overlap between the higher SVI period and the period in which smaller aggregates predominated in the survey.
The activated sludge monitoring experiment yielded important conclusions with respect to the relationships between the aggregate components and morphology, and the free filamentous bacteria contents, in parallel with the SVI and the TSS. Analyzing the filaments versus solids content (TL/TSS) and the filaments versus aggregates content (TL/TA), a strong resemblance was found between the SVI and the behavior of these two parameters, pointing towards the existence of a filamentous bulking phenomenon. Furthermore, TL/TSS values larger than 10,000 mm/mg clearly indicate the existence of a filamentous bulking problem.
The aggregate contents (TA) was the parameter found to contribute the most to the PLS analysis for the TSS, with quite a satisfactory correlation regression of 0.934. Consequently, it seems realistic to infer that the TSS could be satisfactorily monitored by the TA parameter, with a corresponding correlation regression of 0.906 between the predicted and observed TSS. In this study, the SVI values were consistently quite high, so only the relationships for high SVI values could be studied; for a wastewater treatment plant working with satisfactory SVI values, these relations may not hold. Furthermore, as no points were present in the lower section of the regression lines, the search for good correlation values may also have been hindered.
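To make the PLS idea concrete, here is a hedged sketch using scikit-learn's PLSRegression on synthetic stand-in data; the variable names echo the paper's TA and TSS parameters, but the numbers are invented for illustration only.

```python
# Illustrative PLS sketch: relating total aggregate area (TA) to TSS.
# Data are synthetic stand-ins, not values from the study.
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)
ta = rng.uniform(0.5, 5.0, size=(40, 1))        # total aggregate area (arbitrary units)
tss = 1.2 * ta[:, 0] + rng.normal(0, 0.3, 40)   # TSS roughly proportional to TA

pls = PLSRegression(n_components=1)
pls.fit(ta, tss)
tss_pred = pls.predict(ta).ravel()

# Correlation between predicted and observed TSS, analogous to the paper's
# reported regression coefficient between predicted and observed values.
r = np.corrcoef(tss_pred, tss)[0, 1]
print(f"correlation(predicted, observed) = {r:.3f}")
```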
Application of Image Analysis Techniques in Activated Sludge Wastewater Treatment Processes
Ewa Liwarska-Bizukojc, July 2005
Sludge flocs consist mainly of organic matter, which makes up about 70% of their dry weight. As detailed, an activated sludge suspension is a complex ecosystem consisting primarily of bacteria and protozoa. Generally, four main constituents can be discriminated in activated sludge flocs: microorganisms (viable and dead cells), extracellular polymeric substances (mainly carbohydrates and proteins), water, and inorganic particles (sand).
Flocs can be characterized by the fractal concept within a certain size limit. Along with this, it was confirmed that large amounts of extracellular polymeric substances, or exopolymers, were present within the flocs. Extracellular polymeric substances act on substrates and products transferred to and from the microbial cells in the flocs. Microorganisms, water, and extracellular polymeric substances (EPS) were irregularly dispersed within the floc, although the cross-sectional morphology of the flocs appeared similar. Substances to be transferred have to overcome not only the diffusional resistance of the water but also that of the EPS, which surround most of the microbial cells.
The problem of separating solids from treated effluent still persists in many wastewater treatment plants. The issue can mainly be attributed to the bulking of sludge caused by the excessive growth of filamentous microorganisms. An imbalance between the filamentous bacteria and the floc-forming bacteria induces bulking and foaming problems and, as a result, poor quality of the treated effluent.
Activated Sludge Morphology Characterization through an Image Analysis Procedure
Y. G. Perez, S. G .F. Leite and M. A. Z. Coelho, April 2006
Physical differences occur across wastewaters of different composition, depending on the geographic profile. Some flocs might be more resistant and cohere more strongly with other aggregates than those in other wastewater treatment plants. This can be explained by the fact that effluents in domestic wastewater treatment plants usually present high organic loads, which favor an increase in floc size attributed to the higher production of microbial exopolysaccharides under such conditions. Sludge flocs from domestic wastewater treatment systems presented average sizes higher than those from industrial systems.
Differences in wastewater composition arise from the chemicals used in the industrial processes the wastewater passes through. These processes affect the microbial and bio-flocculation characteristics, resulting in the breakage of large flocs and the formation of small, structurally weak flocs that barely settle in the clarifier, leading to a final effluent with high turbidity and organic matter content.
In the study, the morphological parameter Roundness correlated well with the Convexity, Compactness, and Fractal Dimension of the microbial flocs, with Pearson's correlations of -0.7421, -0.6678, and 0.6129, respectively. These values indicate, as expected, that more compact and regular flocs tend to exhibit a more spherical shape. The morphological parameters estimated from different activated sludge samples after image processing were statistically compared in order to identify the major relationships between them. The study used Pearson's product moment correlation coefficient to estimate linear correlations, with the coefficient ranging between -1 and +1: -1 corresponds to a perfect negative correlation, +1 to a perfect positive correlation, and 0 signifies no relationship at all.
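A minimal sketch of that kind of statistical comparison is given below, using synthetic placeholder arrays for the morphological parameters; the signs loosely mimic the reported relationships, but the values do not.

```python
# Sketch of the statistical comparison: Pearson's r between roundness and
# other morphological parameters. The arrays are synthetic placeholders.
import numpy as np
from scipy.stats import pearsonr

rng = np.random.default_rng(2)
roundness = rng.uniform(0.3, 1.0, 50)
convexity = 1.0 - 0.6 * roundness + rng.normal(0, 0.05, 50)    # loosely inverse
fractal_dim = 1.0 + 0.3 * roundness + rng.normal(0, 0.05, 50)  # loosely direct

for name, values in [("convexity", convexity), ("fractal dimension", fractal_dim)]:
    r, p = pearsonr(roundness, values)
    print(f"roundness vs {name}: r = {r:+.3f} (p = {p:.2g})")
```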
Simultaneously Monitoring the Particle Size Distribution, Morphology and Suspended Solids Concentration in Wastewater Applying Digital Image Analysis (DIA)
Ruey-Fang Yu, Ho-Wen Chen, Wen-Po Cheng and Mei-Ling Chu, January 2008
In the paper, various chemical, biological, and physical processes were applied to enlarge particle size and enhance solid–liquid separation efficiency. This is because, in water and wastewater treatment, the particle size distribution of the suspended solids is a major control factor for solid–liquid separation efficiency.
It was stated that more aggregates per image are obtained when images are acquired at 20x magnification. Therefore, when working with higher magnifications such as 100x, it is advantageous to increase the number of acquired images in order to improve the representativeness of the data. This is further supported by the experimental results, which showed an increase in small aggregates detected at 100x magnification, whereas more large aggregates were captured at 20x magnification because the 100x magnification constrains the size of the aggregates that can be imaged.
It was shown that small and intermediate aggregates remained stable during dilution, with a z-value of 0.6642 in both tested cases. On the other hand, the diameter of the larger aggregates decreased from the two-fold dilution onward, with a run of four consecutive descending values, which indicates that deflocculation might have occurred. This points to a different behavior between the small and large aggregates.
Dilution and Magnification Effects on Image Analysis Applications in Activated Sludge Characterization
D.P. Mesquita, O. Dias, R.A.V. Elias, A.L. Amaral, and E.C. Ferreira, June 2010
Modifying the osmotic pressure during a dilution process can trigger biomass deflocculation, causing floc-forming and filamentous bacteria to be released into the mixed liquor, which can change the aggregate size and morphology. According to the research, when a dilution is performed, the amount of screened biomass also decreases, which can lead to incorrect characterization of the biomass within the biological system.
According to the results of the study, increasing the dilution caused aggregates to be more spaced out, allowing the identification of smaller aggregates and filamentous bacteria. However, increasing dilutions led to a decrease in the screened biomass in each field of view and hindered the representativeness of each sample. These conclusions were formulated after each sample was measured properly against the established parameters.
Dilutions are related to aggregate morphology assessment in image analysis techniques. The dependence on dilutions can be evaluated by assessing the behavior of each aggregate class (small, intermediate, and large) after dilution. The Number/Vol, TRA/Vol, Area %, and equivalent diameter for each class of samples are also related to the previously stated assessment of dilutions.
Quantitative Image Analysis for the Characterization of Microbial Aggregates in Biological Wastewater Treatment: A Review
J. C. Costa, D. P. Mesquita, A. L. Amaral, M. M. Alves, and E. C. Ferreira, May 2013
The first difficulty lies in obtaining a suitable sample that is representative of the entire wastewater treatment plant. The development of a sampling procedure that does not damage the biomass integrity and allows us to acquire sharp and useful images remains a challenge. In laboratory prototypes it is a non-issue; in full-scale wastewater treatment plants, however, it can be seen as a hindrance. First, it is necessary to define several strategic sampling points. Then, the sample should be homogenized without damaging the integrity and stability of flocs and/or granules; this is the most critical point of the whole process because it could lead to erroneous results. Nevertheless, the problem of obtaining sharp images can be bypassed by increasing the shutter speed of the cameras used or by adding flashes.
The development of online quantitative image analysis systems for real-time process monitoring and control is one of the most important aspects of this field and its ultimate goal. It may represent a significant and decisive milestone in the implementation of these procedures in biological wastewater treatment processes. The conventional methods currently employed are time consuming and impossible to implement online. One of the main advantages of image analysis procedures compared with classical methods is the possibility of implementing an online system that overcomes the analyzer dependency and time consumption of offline systems.
Particle sedimentation studies are another important field of application for quantitative image analysis, since particle size distribution is one of the major parameters determining solid–liquid separation efficiency. In one such application, a high-magnification micro lens was applied to replace the microscope, combined with a high-resolution CCD, to acquire images that can be used for online measurements. A flow cell with a magnetic pump provided continuous samples. By combining the image analysis data with an ANN model, this method could predict the suspended solids concentration and the particle settling efficiency of samples. It is therefore an image analysis technique that could substitute for a laser particle size analyzer, which is expensive and difficult to use for online measurements.
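As a toy illustration of the image-features-to-ANN idea (not the authors' model), the sketch below trains a small scikit-learn neural network on invented image analysis features to predict a synthetic suspended solids target.

```python
# Toy sketch: predict suspended solids concentration from image analysis
# features with a small neural network. Features and targets are synthetic
# placeholders, not data from the cited work.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(3)
n = 200
features = np.column_stack([
    rng.uniform(10, 200, n),   # e.g. mean particle diameter (um)
    rng.uniform(0.2, 0.9, n),  # e.g. mean roundness
    rng.uniform(50, 500, n),   # e.g. particle count per image
])
ss = 0.5 * features[:, 2] - 0.3 * features[:, 0] + rng.normal(0, 5, n)  # toy target

X_tr, X_te, y_tr, y_te = train_test_split(features, ss, random_state=0)
model = make_pipeline(StandardScaler(),
                      MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000,
                                   random_state=0))
model.fit(X_tr, y_tr)
print(f"R^2 on held-out samples: {model.score(X_te, y_te):.2f}")
```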
Digital Image Processing and Analysis for Activated Sludge Wastewater Treatment
Muhammad B. Khan, Xue Yong Lee, Humaira Nisar, Choon Aun Ng, Kim Ho Yeap, and Aamir Saeed Malik, 2014
Biological oxygen demand (BOD) is a measure of the oxygen required by microorganisms to decompose organic (carbonaceous BOD) and inorganic (nitrogenous BOD) matter in wastewater. A high BOD implies that more matter is being decomposed, allowing more growth of microbes and consequently resulting in low dissolved oxygen (DO). The oxygen is used by microorganisms for decomposition and is needed for organisms to stay alive inside and outside the flocs.
One of the common abnormal conditions encountered in sludge processing is the rising sludge phenomenon. Rising sludge occurs through excessive denitrification, in which nitrogen and phosphorus are removed. The activated sludge flocs attach to the released nitrogen and float to the surface in the secondary clarifier, ultimately leading to increased turbidity, high biological oxygen demand, and reduced settling ability.
In the study, two types of approaches are found in the literature on the monitoring of activated sludge: the first finds a direct correlation between physico-chemical and image analysis parameters; the second finds an "indirect" correlation by identifying the coefficients of an auto-regressive exogenous (ARX) model or by training a neural network. Predictions are based on the idea that flocs and filaments undergo change more readily than the states of the plant.
Understanding Illicit Drug Use Patterns through Wastewater Analysis
Sheree Pagsuyoin, Jana Latayan, and Babelene Pagsuyoin, April 2015
The results of the study show that the cocaine loads measured at the wastewater treatment plant are best described by a single predictor for the service area (served population size), population age (below 18 years old), and employment (blue collar) categories, and by two predictors (primary and tertiary levels) for the education category. They established this using a Pearson correlation analysis among the five variables.
The results showed that cocaine loads are strongly correlated with the population served, the population below 18 years old, the population with blue-collar jobs, and the population with tertiary-level education (R2 values > 0.93, p < 0.0001), and only moderately correlated with primary education (R2 > 0.5, p < 0.0015). There was also a significant correlation (p < 0.0001) between primary education and served population size (R2 = 0.67), population below 18 years old (R2 = 0.64), population with blue-collar jobs (R2 = 0.69), and tertiary-level education (R2 = 0.65).
This study's results on the association of the age predictor with cocaine use in Belgium are consistent with existing survey studies of drug use across demographics, in which the prevalence of cocaine use among ages 15−34 (2%) and among school-age respondents (4%) was significantly higher than the national average of 0.9%; those survey figures are based on self-reports of drug use.
Monitoring Biological Wastewater Treatment Processes: Recent Advances in Spectroscopy Applications
Daniela P. Mesquita, Cristina Quintelas, A. Luis Amaral, and Eugenio C. Ferreira, August 2017
In this study, the researchers found it important to point out that the use of chemometric techniques is practically mandatory, since visual inspection of spectral data is often not sufficient to extract significant information on sample composition, given the broad and unspecific bands obtained. Moreover, molecules such as saturated hydrocarbons and sugars are not detected by this methodology, and UV–Vis spectra of aqueous samples are highly affected by the presence of suspended particles due to light scattering effects.
The implementation of fluorescence instrumentation for online monitoring has been relatively slow due to several factors, such as high quantities of suspended solids and fouling, among others. A significant challenge in using fluorescence spectroscopy as a monitoring technique is overcoming the matrix effects that change the fluorescence signature; such effects can mask small differences in parameter concentrations.
Industrial activities are the most prominent factor in water contamination, exceeding the environment's regenerative and self-purification capacity and causing imbalances in aquatic ecosystems when appropriate treatment is not applied. The influx of new technologies over the past few decades has brought progress to society as well as its own share of problems. It is known that living organisms' activities, urban demand, domestic consumption, and industrial operations, including washing, rinsing, and cleaning equipment, generate high amounts of effluents.
Research Papers: Short Introductions and Their Significance
An Iterative Algorithm for Minimum Cross Entropy Thresholding
C.H. Li and P.K.S. Tam, March 1998
A fast iterative method is derived for minimum cross entropy thresholding using a one-point iteration scheme. Simulations performed using synthetically generated histograms and a real image show the speed advantage and the accuracy of the iterated version.
“Image thresholding based on the gray level histogram is an efficient and important tool for image segmentation (Haralick and Shapiro, 1992). Various algorithms have been proposed to solve this problem. Among them, Li and Lee 1993 introduced the minimum cross entropy thresholding algorithm for thresholding by selecting the threshold which minimizes the cross entropy between the segmented image and the original image. In this paper, a fast iterative method is derived for the minimum cross entropy method.”
This paper presented an iterative algorithm that can significantly reduce computation time for faster and more efficient operation. It works by minimizing the cross entropy between the segmented image and the original image, locating the threshold accurately with far fewer computations than an exhaustive search.
Activated Sludge Monitoring Of a Wastewater Treatment Plant Using Image Analysis and Partial Least Squares Regression
A.L. Amaral, E.C. Ferreira, January 2005
The biomass present in a wastewater treatment plant was surveyed, and its morphological properties were related to operating parameters such as the total suspended solids (TSS) and sludge volume index (SVI); image analysis was used to provide the morphological data. The results denoted the existence of a severe bulking problem of non-zoogleal nature, and the PLS analysis revealed a strong relationship between the TSS and the total aggregate area, as well as a close correlation between the filamentous bacteria per suspended solids ratio and the SVI.
“The activated sludge is a complex ecosystem mainly of bacteria and protozoa. In fact, a good balance between the bacteria and protozoa and among the different species of each group is crucial for efficient pollution removal, good settleability properties and low suspended solids effluent levels. Bacteria agglomerate as aggregates, mainly due to exopolymers excretion and a filamentous bacteria backbone towards an ideal floc with reasonable size and well balanced floc-forming and filamentous bacteria composition. Furthermore, they should be also quite robust and have good settling capabilities, hence leading to a low organic matter and low turbidity final effluent.”
This paragraph helped establish the definition of activated sludge in the research paper. It laid down the foundation for how the bacteria in activated sludge behave, which is a significant factor in the digital image processing and analysis. Without knowing the test subject's components, the results would be inconsistent and useless.
Application of Image Analysis Techniques in Activated Sludge Wastewater Treatment Processes
Ewa Liwarska-Bizukojc, July 2005
Image analysis techniques have been developed to evaluate microbial aggregates such as sludge flocs. The paper also discussed the latest innovations concerning image analysis in sludge wastewater systems with respect to the most frequently used morphological parameters and the relations between them and traditional wastewater treatment parameters.
“Additionally, image analysis programmes are elaborated by groups of scientists and available as a public domain. Here, an example is ImageJ 1.33 (http://rsb.info.nih.gov/ij/) elaborated by the Research Services Branch of National Institutes of Health (USA) and DAIME (Digital Image Analysis in Microbial Ecology), created in the Department of Microbial Ecology of Vienna Ecology Centre (University of Vienna). DAIME is a novel computer programme that integrates digital image analysis and 3-D visualisation functions. It can analyse the digital images from epifluorescence microscopes and confocal image stacks (CLSM).”
There are several examples of commercial image analysis software packages, usually offered by the companies that supply microscopes and imaging systems. This paragraph provided the research with examples of different software that can be used to digitally process and analyze sludge wastewater samples.
Activated Sludge Morphology Characterization through an Image Analysis Procedure
Y. G. Perez, S. G .F. Leite and M. A. Z. Coelho, April 2006
This study deals with the development of a digital image analysis procedure to characterize microbial flocs obtained from three different wastewater treatment plants. The developed procedure permits obtaining morphological parameters such as equivalent diameter, compactness, roundness, and porosity, as well as the fractal dimension. The procedure was validated and led to the identification of the major relationships between the analyzed morphological parameters.
“The first step of the procedure consisted on the conversion of RGB images into grey-level images and the subsequent background correction to reduce the uneven background intensities caused by uneven lighting from image acquisition system. The greylevel images were then enhanced by means of the image processing tools (histogram equalization and median filtering).”
This excerpt is from the digital imaging procedure, which guides the researcher on how the process should be conducted. It emphasizes how the image samples should be converted to greyscale with background correction to produce useful results.
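A hedged sketch of that pre-processing chain follows, assuming scikit-image and using a bundled sample picture as a stand-in for a floc image; the background correction here simply subtracts a heavily blurred copy, which is one common choice rather than necessarily the authors' exact method.

```python
# Sketch of the described pre-processing chain: RGB -> grey-level conversion,
# background correction, histogram equalization, and median filtering.
import numpy as np
from skimage import data, color, exposure, filters, morphology

rgb = data.astronaut()                 # stand-in for an acquired floc image
grey = color.rgb2gray(rgb)

# Background correction: estimate the uneven illumination with a heavy blur
# and subtract it (one simple choice among several possible methods).
background = filters.gaussian(grey, sigma=50)
corrected = np.clip(grey - background + background.mean(), 0, 1)

equalized = exposure.equalize_hist(corrected)              # histogram equalization
smoothed = filters.median(equalized, morphology.disk(3))   # median filtering
print(smoothed.shape, smoothed.dtype)
```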
Simultaneously Monitoring the Particle Size Distribution, Morphology and Suspended Solids Concentration in Wastewater Applying Digital Image Analysis (DIA)
Ruey-Fang Yu, Ho-Wen Chen, Wen-Po Cheng and Mei-Ling Chu, January 2008
Sedimentation is one of the most commonly used processes for particle separation in water and wastewater treatments. In this paper, the proponents observed the particle size distribution, morphology, and suspended solids concentration in wastewater through digital image analysis wherein they utilized a laser particle size analyzer to analyze the particle distribution.
“The efficiency of particle sedimentation in wastewater treatment is seriously affected by particle size distribution and morphology.”
This is an essential piece of information since it puts the factor of morphology into perspective and consideration in assessing the digitally processed and analyzed image results.
Dilution and Magnification Effects on Image Analysis Applications in Activated Sludge Characterization
D.P. Mesquita, O. Dias, R.A.V. Elias, A.L. Amaral, and E.C. Ferreira, June 2010
In activated sludge wastewater systems, accurate sludge characterization requires the samples to be diluted, especially when working with high biomass content. Subsequently, the said characterizations are to be done through image analysis procedures. In this paper, the proponents studied the effects of dilution and magnification in assessing aggregated filamentous content in image analyses. Assessments of biomass content and structure were affected by dilutions. Therefore, the correct operating dilution requires careful consideration.
“This study demonstrated the vulnerability of image analysis to dilutions. We show that it is impossible to predict and quantify the dilution effect for the aggregated and filamentous biomass as a whole, on the basis of a single correction factor.”
This paragraph emphasized that it is not sufficient to rely on a single correction factor. Therefore, the optimal operating dilution must be carefully established, and the determination of the aggregate recognition percentage could be a valuable methodology for identifying this dilution factor.
Quantitative Image Analysis for the Characterization of Microbial Aggregates in Biological Wastewater Treatment: A Review
J. C. Costa, D. P. Mesquita, A. L. Amaral, M. M. Alves, and E. C. Ferreira, May 2013
The paper discussed different quantitative image analysis techniques in the field of research over the last decade, which includes the objective prediction of filamentous bulking. It also demonstrated its usefulness in classifying protozoa and metazoan populations. In high rate anaerobic processes, aggregation times and fragmentation phenomena could be detected during critical events. This work also details the major efforts being undertaken to develop more advanced quantitative image analysis techniques.
“The overall process can be summarized in three main steps: 1. Sample preparation and image acquisition: The object to be analyzed should be carefully sampled and prepared according to the study objective…2. Image processing: The set of operations performed to transform a raw input image into a final output image…3. Image analysis: Finally, the morphological parameters are analyzed.”
This paragraph outlined the basic steps of the digital imaging process and analysis so as to help the researcher, and even the readers, fully comprehend the process behind the procedure. These steps will then help gauge the correctness of the steps followed in the research.
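To illustrate the final image analysis step, here is a small sketch that labels objects in a binary image and extracts simple morphological descriptors; the sample image, the generic Otsu threshold, and the 50-pixel debris cut-off are assumptions for illustration.

```python
# Sketch of the "image analysis" step: label aggregates in a binary image
# and extract simple morphological descriptors.
import numpy as np
from skimage import data, filters, measure

grey = data.coins() / 255.0
binary = grey > filters.threshold_otsu(grey)

labels = measure.label(binary)
for region in measure.regionprops(labels):
    if region.area < 50:                           # skip tiny debris
        continue
    eq_diam = np.sqrt(4.0 * region.area / np.pi)   # equivalent diameter (pixels)
    print(f"aggregate {region.label}: area={region.area}, "
          f"eq. diameter={eq_diam:.1f}px, eccentricity={region.eccentricity:.2f}")
```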
Digital Image Processing and Analysis for Activated Sludge Wastewater Treatment
Muhammad B. Khan, Xue Yong Lee, Humaira Nisar, Choon Aun Ng, Kim Ho Yeap, and Aamir Saeed Malik, 2014
The activated sludge system is generally used in wastewater treatment plants to process domestic influent. Activated sludge is monitored by measuring the sludge volume index (SVI) and similar quantities specific to sludge wastewater. Digital image processing and analysis offers a better alternative for monitoring the present sludge state and predicting future trends. It is observed that image analysis can play a very useful role in the monitoring of activated sludge wastewater treatment plants.
“There have been a different set of parameters to characterize the organic and inorganic content of wastewater for different perspectives. For example, SVI is a good indicator of settle-ability. The following are the parameters often found in literature to characterize wastewater and/or activated sludge...”
This paragraph and the outline that follows described the different characteristics of sludge wastewater that may affect the data sampling, treatment, processing, and analysis. It also stated the different abnormal conditions that can lead to discrepancies in interpreting the results.
Understanding Illicit Drug Use Patterns through Wastewater Analysis
Sheree Pagsuyoin, Jana Latayan, and Babelene Pagsuyoin, April 2015
Illicit drug use is an ever-present problem with serious consequences for the health and socioeconomic well-being of individuals and of society as a whole. Identifying and estimating it in communities is difficult because of privacy laws and doctor-patient confidentiality. Various strategies and research methodologies rely on surveys and reports from health and other government agencies. In this study, the proponents employed a relatively new approach called "wastewater epidemiology" to study the patterns of cocaine use in Belgium, offering real-time data and a non-invasive alternative for understanding the patterns of drug use in communities.
“Considering the evolving patterns of drug consumption within communities and with the emergence of new and more complex illicit substances in the market, newer methods for collecting relevant information are necessary in order for policymakers to develop a timely and more effective course of action. One such method is wastewater epidemiology, which refers to the study of illicit drug use patterns in the community through wastewater analysis. In principle, it involves the detection in wastewater of target compounds that are excreted by humans; these measurements are then used to back-calculate the consumption levels of these compounds in the community served by the sewer network.”
This paper provided a different perspective on the importance of analyzing sludge wastewater samples, namely to detect drug traces in a certain population demographic. This proves to be another function of sludge wastewater processing and analysis.
Monitoring Biological Wastewater Treatment Processes: Recent Advances in Spectroscopy Applications
Daniela P. Mesquita, Cristina Quintelas, A. Luis Amaral, and Eugenio C. Ferreira, August 2017
Aerobic and anaerobic processing technologies have been continuously developed for wastewater treatment and are currently routinely employed to reduce contaminant discharge levels in the environment. However, most methodologies are labor intensive and time consuming. Thus, spectroscopy applications in biological processes are nowadays considered a rapid and effective alternative technology for real-time monitoring, though they still lack implementation in full-scale plants.
“…considerable efforts are being undertaken to develop spectroscopic methods for monitoring and quantifying key parameters involved in WWT processes.”
This excerpt from the study indicates that spectroscopic applications are being actively developed to meet the needs of different wastewater treatment plants and their procedures. In turn, these efforts aim to replace the costly, labor-intensive, and time-consuming traditional processes with faster alternative technologies.
Literature Review
Activated Sludge Wastewater Monitoring through Digital Image Processing and Analysis
Most Asian cities do not have access to proper wastewater treatment systems and facilities. In the Philippines alone, only 10 percent of wastewater is treated. In response to these issues, the Department of Science and Technology (DOST) has pushed an advocacy for cleaner and safer wastewater treatment technologies and for sustainable measures against the pressing environmental repercussions of untreated and unprocessed wastewater.
Inside the Sludge Wastewater Treatment
Activated sludge is a fairly complex ecosystem made up of bacteria and protozoa of different species, most of which play their own roles in efficient pollution removal. Bacteria agglomerate into aggregates, mainly through the excretion of extracellular polymeric substances (EPS) around a filamentous bacteria backbone, forming flocs with a balanced composition of floc-forming and filamentous bacteria (Amaral, A., et al., 2005). Ideally, these flocs are also quite robust and compact and settle well, resulting in a final effluent with low organic matter and low turbidity.
Characterizing wastewater benefits not only researchers but also wastewater treatment plants (WWTPs). This can be attributed to two main reasons. Firstly, activated sludge is one of the most common means of microbiological degradation in wastewater. Secondly, flocs determine the quality of the effluent and the efficiency of the WWTP, proving to be a major component in wastewater purification.
Aside from alleviating environmental issues, wastewater monitoring can be an avenue for drug tracing in city sewage systems, canals, and household septic tanks (Pagsuyoin, S., et al., 2015). It involves the detection in wastewater of target compounds excreted by humans; these measurements are then used to back-calculate the consumption levels of these compounds in the community served by the sewer network.
According to Khan, the activated sludge process employs some definitions specific to the process itself. Two of them are Mixed Liquor Suspended Solids (MLSS) and Mixed Liquor Volatile Suspended Solids (MLVSS), which are the total suspended solids and volatile suspended solids in the mixed liquor, respectively. Hydraulic Retention Time is the mean time the primary effluent stays in the aeration tank. Sludge Age, also known as Mean Cell Retention Time, is the average residence time of the microorganisms in the activated sludge system. Lastly, the Sludge Volume Index (SVI) is defined as the volume occupied by a gram of activated sludge after settling of the mixed liquor. Aerobic wastewater treatment, also known as the activated sludge process, utilizes microorganisms to oxidize both organic and inorganic matter. The process consists primarily of aeration tanks and clarifiers, with the microbial flocs settling in a secondary clarifier (Khan, M.B., et al., 2014).
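As a quick arithmetic sketch of the SVI definition above, using the common formulation SVI (mL/g) = settled sludge volume (mL/L, typically after 30 minutes of settling) divided by MLSS (g/L); the numbers are assumed example values, not measurements from the cited sources.

```python
# Hedged arithmetic sketch of the Sludge Volume Index (SVI), using the common
# definition SVI [mL/g] = settled sludge volume [mL/L] / MLSS [g/L].
# The numbers are assumed example values, not measurements from the sources.

settled_volume_mL_per_L = 250.0   # sludge volume after 30 min of settling
mlss_mg_per_L = 2500.0            # mixed liquor suspended solids

svi_mL_per_g = settled_volume_mL_per_L / (mlss_mg_per_L / 1000.0)
print(f"SVI = {svi_mL_per_g:.0f} mL/g")  # ~100 mL/g; much higher values are often a sign of bulking
```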
Digital Imaging Process
Sludge wastewater digital imaging begins with the acquisition of filamentous image samples through phase contrast microscopy, magnified to a certain extent to produce aggregate images (Amaral, A., et al., 2005). The following step of the process consists of converting RGB images into greyscale images and then applying background correction to reduce the uneven background intensities caused by uneven lighting or other lighting discrepancies. The greyscale images are then enhanced by means of image processing tools such as histogram equalization and median filtering (Perez, Y.G., et al., 2006). Other processes, such as aggregate segmentation, filament segmentation, and debris elimination, consist primarily of determining the different components in the previously acquired sludge sample: aggregates are labeled, and noise and debris are removed to produce clearer pictures. According to Costa (et al., 2013), the overall image analysis process can be organized in three steps: sample preparation and image acquisition, image processing, and image analysis. Sampling is marked by pre-treatment; image processing is the transformation of a raw input into a final output image; and image analysis concerns the determination of descriptors of object morphology, size, and shape, which also affect the efficiency of particle sedimentation in wastewater treatment (Yu, R., et al., 2008).
It is essential to produce image samples of good quality to ensure consistency in producing results and accuracy in translating them. To achieve this, researchers have formulated different algorithms to guarantee the performance and results of the imaging process and analysis. One of them is an iterative algorithm for minimum cross entropy thresholding, which allows for accurate threshold location with the minimum number of computations required (Li, C.H., et al., 1998). It works by selecting the threshold that minimizes the cross entropy of the image. This algorithm is but one of dozens of existing algorithms that can further enhance the digital imaging process, each suited to different specifications.
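For a practical, hedged example, scikit-image ships an implementation of Li's iterative minimum cross entropy threshold (threshold_li), which can be combined with small-object removal to mimic the segmentation and debris-elimination steps described above; the sample image and the 100-pixel cut-off are illustrative assumptions, not values from the cited studies.

```python
# Sketch of aggregate segmentation plus debris elimination. scikit-image's
# threshold_li implements Li's iterative minimum cross entropy method; the
# sample image and the 100-pixel debris cut-off are illustrative assumptions.
from skimage import data, filters, measure, morphology

grey = data.coins() / 255.0
threshold = filters.threshold_li(grey)            # iterative minimum cross entropy
binary = grey > threshold

cleaned = morphology.remove_small_objects(binary, min_size=100)  # drop debris/noise
labels = measure.label(cleaned)
print(f"threshold = {threshold:.3f}, aggregates kept = {labels.max()}")
```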
Along with this come different microscopic techniques, commercial image analysis software offered by microscope and digital imaging system companies, and public-domain packages such as ImageJ and DAIME (Digital Image Analysis in Microbial Ecology), the latter created at the University of Vienna (Liwarska-Bizukojc, E., 2005). One of the most popular software packages used today is MATLAB, which is also applied in computer vision, deep learning, robotics, and the like.
In the review paper "Monitoring biological wastewater treatment processes: recent advances in spectroscopy applications" (Mesquita, D., et al., 2017), the proponents addressed different spectroscopy systems currently being developed for aerobic and anaerobic WWTP processes. The review detailed the efforts being undertaken in developing spectroscopic methods for monitoring and quantifying key parameters in the wastewater treatment process, with the aim of replacing conventional methods, which are quite time-consuming, costly, and laborious.
Special Sludge Factors and Conditions
While it is important to be familiar with the process, it is also essential to know about the possible hindrances to effective digital imaging procedures that may affect the quality of the sludge wastewater samples. In the study by Mesquita et al. (2010) entitled "Dilution and Magnification Effects on Image Analysis Applications in Activated Sludge Characterization," the proponents emphasized the vulnerability of image analyses to dilutions. According to their study, it is impossible to predict and quantify the dilution effect for the aggregated and filamentous biomass as a whole on the basis of a single correction factor. Therefore, the optimal operating dilution must be carefully established, and the determination of the aggregate recognition percentage could be a valuable methodology for identifying this dilution factor. Along with this, other researchers have also discussed possible abnormal conditions that may affect the interpretation of results. These include dispersed growth, which results in high effluent turbidity; viscous or non-filamentous bulking, manifested by reduced settling and compaction; pinpoint flocs, associated with low SVI; filamentous bulking, caused by the overgrowth of filamentous bacteria; and foaming or scum formation, caused by non-biodegradable surfactants such as detergents.
REFERENCES
Li, C., & Tam, P. (1998). An iterative algorithm for minimum cross entropy thresholding. Pattern Recognition Letters,19(8), 771-776. doi:10.1016/s0167-8655(98)00057-9
Amaral, A., & Ferreira, E. (2005). Activated sludge monitoring of a wastewater treatment plant using image analysis and partial least squares regression. Analytica Chimica Acta,544(1-2), 246-253. doi:10.1016/j.aca.2004.12.061
Liwarska-Bizukojc, E. (2005). Application of Image Analysis Techniques in Activated Sludge Wastewater Treatment Processes. Biotechnology Letters,27(19), 1427-1433. doi:10.1007/s10529-005-1303-2
Perez, Y. G., Leite, S. G., & Coelho, M. A. (2006). Activated sludge morphology characterization through an image analysis procedure. Brazilian Journal of Chemical Engineering,23(3), 319-330. doi:10.1590/s0104-66322006000300005
Yu, R., Chen, H., Cheng, W., & Chu, M. (2008). Simultaneously monitoring the particle size distribution, morphology and suspended solids concentration in wastewater applying digital image analysis (DIA). Environmental Monitoring and Assessment,148(1-4), 19-26. doi:10.1007/s10661-007-0135-z
Mesquita, D., Dias, O., Elias, R., Amaral, A., & Ferreira, E. (2010). Dilution and Magnification Effects on Image Analysis Applications in Activated Sludge Characterization. Microscopy and Microanalysis,16(05), 561-568. doi:10.1017/s1431927610093785
Costa, J. C., Mesquita, D. P., Amaral, A. L., Alves, M. M., & Ferreira, E. C. (2013). Quantitative image analysis for the characterization of microbial aggregates in biological wastewater treatment: a review. Environmental Science and Pollution Research,20(9), 5887-5912. doi:10.1007/s11356-013-1824-5
Khan, M. B., Lee, X. Y., Nisar, H., Ng, C. A., Yeap, K. H., & Malik, A. S. (2014). Digital Image Processing and Analysis for Activated Sludge Wastewater Treatment. Signal and Image Analysis for Biomedical and Life Sciences Advances in Experimental Medicine and Biology,227-248. doi:10.1007/978-3-319-10984-8_13
Pagsuyoin, S., Latayan, J., & Pagsuyoin, B. (2015). Understanding illicit drug use patterns through wastewater analysis. 2015 Systems and Information Engineering Design Symposium. doi:10.1109/sieds.2015.7116994
Mesquita, D. P., Quintelas, C., Amaral, A. L., & Ferreira, E. C. (2017). Monitoring biological wastewater treatment processes: recent advances in spectroscopy applications. Reviews in Environmental Science and Bio/Technology,16(3), 395-424. doi:10.1007/s11157-017-9439-9
Break the Stigma: Mental Health and Technology
The world has long been fighting a battle against mental illness. Bills are being enacted, and different technological advancements are being researched and developed in order to aid the fields of psychology, neuroscience, and the like. In the Philippines, 17 to 20 percent of Filipinos suffer from a psychiatric disorder, according to the National Statistics Office (2016). In the last quarter of 2017, the Senate passed a bill dubbed the Mental Health Act of 2017. This is a sign of people's growing awareness and willingness to delve into matters of the mind and to do what is possible to solve the problems that come with them.
Understanding Mental Health and Illness
The World Health Organization defines mental health as "a state of well-being in which the individual realizes his or her own abilities, can cope with the normal stresses of life, can work productively and fruitfully, and is able to make a contribution to his or her community." However, some believe that this definition shies away from a full conceptualization of mental health, portraying it merely as the absence of mental illness, which can lead to misconceptions and potential misunderstandings in defining the actual state of a healthy mind.
Thus, scientists have proposed a newer, more inclusive definition: Mental health is a dynamic state of internal equilibrium which enables individuals to use their abilities in harmony with universal values of society. Basic cognitive and social skills; ability to recognize, express and modulate one's own emotions, as well as empathize with others; flexibility and ability to cope with adverse life events and function in social roles; and harmonious relationship between body and mind represent important components of mental health which contribute, to varying degrees, to the state of internal equilibrium.
Mental illnesses, or disorders, are defined by the Diagnostic and Statistical Manual of Mental Disorders (DSM-IV) as "a psychological syndrome or pattern which is associated with distress (e.g. via a painful symptom), disability (impairment in one or more important areas of functioning), increased risk of death, or causes a significant loss of autonomy; however it excludes normal responses such as grief from loss of a loved one, and also excludes deviant behavior for political, religious, or societal reasons not arising from a dysfunction in the individual."
Using Technology in Psychiatric Treatment
Technology has opened up new avenues for the public, doctors, researchers, and scientists in mental health support and data collection. Mobile devices such as smartphones, tablets, and smart watches provide new ways to access, monitor, and further understand mental welfare. Nowadays, people often associate poor mental health with the excessive use of gadgets and other forms of technology. That may be true, but fortunately, researchers have found ways to incorporate mental wellness apps into our mobile phones, available on the go.
The use of such technologies as a supplement to mainstream therapies for mental disorders is an emerging mental health treatment field which could improve the accessibility, effectiveness, and affordability of mental health care. As technology evolved, psychiatrists began to employ virtual reality as a means for the mentally ill to cope with their conditions. And with proper support, digital interventions can be as effective as face-to-face treatments (Andersson et al., 2014; Cuijpers, Donker, van Straten, Li, & Andersson, 2010).
Artificial Intelligence (AI)
We often think of AI as a non-sentient entity conversing with us through virtual assistants like Siri or Alexa, helping with the mundane tasks of everyday life. But in reality, AI goes beyond that constraint. Now, artificial intelligence is being used to revolutionize mental healthcare.
AI applications like Tess by X2_AI, IBM’s Watson, and Google Deepmind AI are called psychological AI. They interact with the users with the aim of providing psychotherapy and cognitive behavioral therapy. According to IBM Research: “Cognitive computers will analyze a patient’s speech or written words to look for tell-tale indicators found in language, including meaning, syntax and intonation. Combining the results of these measurements with those from wearables devices and imaging systems (MRIs and EEGs) can paint a more complete picture of the individual for health professionals to better identify, understand and treat the underlying disease, be it Parkinson’s, Alzheimer’s, Huntington’s disease, PTSD or even neurodevelopmental conditions such as autism and ADHD.”
Through artificial intelligence, mental health assessments and treatment can become possible from the comfort of our own homes.
Virtual Reality (VR)
Virtual reality is an effective way to immerse patients in an artificial environment that lets them simulate an experience without physically going through it again, helping them cope and adjust.
Psious is a Spanish-American behavioral health technology company whose main product is the PsiousToolsuite, a virtual reality platform aimed at bringing value to mental health treatment. The application allows the user to enter a therapy session from the comfort of their present location, such as the office break room. Psious was developed primarily as a means of exposure therapy: the gizmo can simulate a free-fall experience for someone who is afraid of heights without them having to actually parachute across the sky.
In the study "Virtual reality therapy for agoraphobic outpatients in Lima, Peru" (Suyo, M.I., et al., 2015), the proponents tested eight patients of both sexes with a clinical diagnosis of agoraphobia. Subjects were exposed to virtual reality environments generated by the Psious virtual reality application for agoraphobia treatment, and skin conductance (measured in microsiemens) and the scale of subjective units of anxiety (SUDS) were recorded while the patients were exposed to anxiety-provoking virtual environments; measurements were taken over five sessions. They drew the conclusion that all eight patients showed clinical improvement, with six patients improving by more than 50%, with statistically significant results.
A study spearheaded by Dr. Albert Rizzo entitled "BRAVEMIND: Advancing the Virtual Iraq/Afghanistan PTSD Exposure Therapy for MST" (Rizzo, A., 2016) focuses on helping patients with post-traumatic stress disorder (PTSD). Bravemind simulates a war zone, such as Iraq, to activate "extinction learning," which can deactivate a deep-seated "fight or flight" response, relieving fear and anxiety. Along with PTSD, Bravemind can also be used to address trauma from sexual abuse.
Mobile Apps
Mobile mental health apps are on the rise because they offer an affordable and accessible solution, not to mention anonymity. Developers have created different mobile applications that cater to different needs like insomnia, panic attacks, anxiety, and the like. There are also applications for meditation and breathing exercises to help the user calm down in times of stress. There are established digital treatments for depression, anxiety disorders, and even insomnia (Andersson & Titov, 2014). They are self-help programs designed to be used either on their own or with some form of support. These treatments vary in their content, clinical range, format, functionality, and mode of delivery. Apps for inspiration, positivity, and mood tracking also exist, along with apps that allow users to vent about their feelings and get them off their chest.
One notable app is called Mindstrong. It monitors smartphone behavior with permission from the user. According to Dr. Thomas Insel (2017), one of the researchers behind this app, if a user starts typing more rapidly than normal, their syntax changes, or they indulge in impulsive shopping sprees, that might be an indicator that they're manic. If they don't respond to texts from family and friends, they might be depressed. Together, this data collection could create what is referred to as a "digital phenotype," which could be described as a personalized mental health map.
Conclusion
Various studies have shown the importance and benefits in using technology to overcome, or at least alleviate, mental illnesses. The world is a long way from completely figuring out and solving these problems, but with the help of the constant evolution of technology, it might be very possible in the not too distant future.
Mental health is a serious matter that should be given utmost attention and concern. It might be seen as impossible to eradicate this problem, but as St. Francis of Assisi said, “Start by doing what’s necessary, then do what’s possible; and suddenly you are doing the impossible.”
REFERENCES
Andersson, G., Titov, N. (2014). Advantages and limitations of Internet-based interventions for common mental disorders. World Psychiatry, pp. 4-11
Andersson, G., Cujipers P., Carlbring P., Riper H., Hedman, E. (2014). Guided internet-based vs. face-to-face cognitive behavior therapy for psychiatric and somatic disorders: A systematic review and meta-analysis. World Psychiatry, pp. 288-295
Farr, C. (2017, May 10). Former Alphabet exec is working on an idea to detect mental disorders by how you type on your phone. CNBC. Retrieved January 20, 2018, from https://www.cnbc.com/2017/05/10/thomas-insel-ex-alphabet-mindstrong-track-mental-health-smartphone-use.html
National Institute of Mental Health. (2017). Technology and the future of mental health treatment. Retrieved January 20, 2018, from https://www.nimh.nih.gov/health/topics/technology-and-the-future-of-mental-health-treatment/index.shtml
IBM Research. (2018). With AI, our words will be a window into our mental health. Retrieved January 20, 2018, from http://research.ibm.com/5-in-5/mental-health/
Rizzo, A. (2016). BRAVEMIND: Advancing the Virtual Iraq/Afghanistan PTSD Exposure Therapy for MST.
Stein, D. J., Phillips, K. A., Bolton, D., Fulford, K. W. M., Sadler, J. Z., & Kendler, K. S. (2010). What is a mental/psychiatric disorder? From DSM-IV to DSM-V. Psychological Medicine, 40(11), 1759–1765. doi:10.1017/S0033291709992261
U.S. Department of Health and Human Services. (1999). Mental health: A report of the Surgeon General. Rockville, MD: U.S. Public Health Service.
Vásquez Suyo, M. I., et al. (2015). Virtual reality therapy for agoraphobic outpatients in Lima, Peru. European Psychiatry, 33, S397–S398.
World Health Organization. (2004). Promoting mental health: Concepts, emerging evidence, practice (Summary report). Geneva: World Health Organization.
We live in a world where information is recognized as a highly valuable asset (Moody & Walsh, 1999). It keeps the gears of the commercial world spinning: whether your organisation is large or small, if you do not understand your information, you cannot fully protect and exploit it. This is where data mining comes in, and its scope is vast. Data miners do not gather data merely for the sake of doing so. Data mining is used in various fields by various organizations to predict patterns, situational outcomes, and the like; by applications to learn user behavior and optimize the experience accordingly; and by commercial organizations to achieve a wide range of objectives and goals. To gather significant and usable data, data miners look at how strongly each variable is related to the others around it, which is what correlation analysis measures.
Correlation is a means of seeing how the elements of a given problem interact with one another. If you find yourself asking how certain factors in a problem you are trying to solve interact, building a correlation matrix is the way to find out. For instance, does customer satisfaction change with the seasons? Does the amount of rainfall change the price of a crop? Does household income influence which shoe brands people are more likely to purchase? The answer to each of these questions is probably ‘yes’, but correlation can not only tell us whether that is true; it can also tell us how strong the interactions are when they occur, if they occur at all.
There are two main types of correlation coefficients: Pearson's product moment correlation coefficient and Spearman's rank correlation coefficient. Which coefficient is appropriate depends on the types of variables being studied. Pearson's product moment correlation coefficient is used when both variables are normally distributed. It is affected by extreme values, which may exaggerate or obscure the strength of the relationship, and is therefore inappropriate when either or both variables are not normally distributed. Spearman's rank correlation coefficient is appropriate when one or both variables are skewed or ordinal, and it remains robust when extreme values are present.
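To make that difference concrete, here is a minimal sketch (the data are invented for illustration) that computes both coefficients with SciPy, first on a roughly linear series and then after adding a single extreme point:

```python
# Minimal sketch: Pearson vs. Spearman on invented data, with and without
# one extreme value. Pearson is distorted noticeably more than Spearman.
from scipy.stats import pearsonr, spearmanr

x = [1, 2, 3, 4, 5, 6, 7, 8]
y = [2, 3, 3, 5, 6, 6, 8, 9]          # roughly linear, monotonic relationship

r_p, _ = pearsonr(x, y)
r_s, _ = spearmanr(x, y)
print(f"Pearson r = {r_p:.3f}, Spearman rho = {r_s:.3f}")

# Add one point that breaks the trend.
x_out, y_out = x + [9], y + [0]
r_p_out, _ = pearsonr(x_out, y_out)
r_s_out, _ = spearmanr(x_out, y_out)
print(f"With outlier: Pearson r = {r_p_out:.3f}, Spearman rho = {r_s_out:.3f}")
```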
To further expand our understanding of correlation, it helps to look at how a correlation coefficient is calculated (Taylor, 2017). To see exactly how the value of r is obtained, consider this example. We begin with a listing of paired data: (1, 1), (2, 3), (4, 5), (5, 7). The mean of the x values 1, 2, 4, and 5 is x̄ = 3, and the mean of the y values is ȳ = 4. The standard deviation of the x values is sx = 1.83 and that of the y values is sy = 2.58. For each pair we multiply the standardized x value by the standardized y value; these products are approximately 1.273, 0.212, 0.212, and 1.273, and their sum is 2.969848. Since there are a total of four points and 4 – 1 = 3, we divide the sum of the products by 3. This gives a correlation coefficient of r = 2.969848/3 = 0.989949. A correlation coefficient close to 1.0 indicates that the two fields are strongly correlated: if the value of one field is high, the value of the other is also likely to be high. A correlation coefficient close to 0 indicates that the two fields are not correlated, meaning that the value of one field cannot be inferred from the value of the other.
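The same arithmetic can be reproduced in a few lines of Python; this is only a sketch of the hand calculation above, using the sample (n − 1) convention throughout:

```python
# Sketch of the worked example: r is the sum of the products of the
# standardized x and y values, divided by n - 1.
import math

pairs = [(1, 1), (2, 3), (4, 5), (5, 7)]
xs, ys = [p[0] for p in pairs], [p[1] for p in pairs]
n = len(pairs)

mean_x, mean_y = sum(xs) / n, sum(ys) / n                       # 3.0 and 4.0
s_x = math.sqrt(sum((x - mean_x) ** 2 for x in xs) / (n - 1))   # ~1.83
s_y = math.sqrt(sum((y - mean_y) ** 2 for y in ys) / (n - 1))   # ~2.58

products = [((x - mean_x) / s_x) * ((y - mean_y) / s_y) for x, y in pairs]
r = sum(products) / (n - 1)
print(round(r, 6))                                              # 0.989949
```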
Data mining and correlation form a very broad field, and so they have many useful applications. The medical field is just one of a host of disciplines where correlation is put to use. A simple application of the correlation coefficient can be illustrated with data from a sample of 780 women attending their first antenatal clinic visits (Mukaka, 2012): we can expect a positive linear relationship between maternal age in years and parity, because parity cannot decrease with age, but we cannot predict the strength of this relationship without computing the coefficient. Correlation also plays a role in education. Student score analysis is an important part of educational research (Dai et al., 2011), and multivariate methods in score analysis are essential for teachers and administrators who want to extract more information from available score data. There is also correlation in finance. The presence of significant cross-correlations between the synchronous time evolution of equity returns is a well-known empirical fact. The Pearson correlation is commonly used to indicate the level of similarity in the price changes for a given pair of stocks, but it does not measure whether other stocks influence the relationship between them. To explore the influence of a third stock on the relationship between two stocks, a partial correlation measurement can be used to uncover the underlying relationships between financial assets. These are only a few of the uses of correlation in practice; it is also applied in Life Sciences (LS), Customer Relationship Management (CRM), web applications, manufacturing, retail, banking, security, climate modeling, and astronomy, to name a few. Without correlation, data analysis would be much more difficult, because we would not know how our variables relate to one another.
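As a hedged illustration of the partial correlation idea mentioned above, the first-order partial correlation between two stocks, controlling for a third, can be computed from the three pairwise Pearson coefficients. The return series below are invented; only the formula and workflow are the point.

```python
# Sketch: partial correlation of the returns of stocks x and y, controlling
# for a third stock z. The return series are invented for illustration.
import math
from scipy.stats import pearsonr

x = [0.010, -0.020, 0.015, 0.030, -0.010, 0.005, 0.020, -0.015]
y = [0.012, -0.018, 0.010, 0.025, -0.008, 0.007, 0.018, -0.020]
z = [0.008, -0.010, 0.012, 0.020, -0.005, 0.004, 0.015, -0.012]

r_xy, _ = pearsonr(x, y)
r_xz, _ = pearsonr(x, z)
r_yz, _ = pearsonr(y, z)

# First-order partial correlation: remove the part of the x-y relationship
# that is explained by each stock's co-movement with z.
r_xy_given_z = (r_xy - r_xz * r_yz) / math.sqrt((1 - r_xz**2) * (1 - r_yz**2))
print(round(r_xy, 3), round(r_xy_given_z, 3))
```

If the partial coefficient is much smaller than the plain Pearson coefficient, most of the apparent relationship between the two stocks is actually driven by the third.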
As an illustration, let us take a look at the dataset Major League Baseball (MLB) Wins Above Avg. By Position, analyzed in RapidMiner. The dataset has 17 attributes, so the resulting correlation matrix is 17x17. In the matrix RapidMiner produces, the attributes show a strong correlation of 1.0. The book states that a coefficient between 0 and 1 represents a positive correlation, and a coefficient between 0 and -1 represents a negative correlation. A positive correlation signifies that as one attribute's value rises, so does the other's, and vice versa.
In the dataset Major League Baseball (MLB) Wins Above Avg. By Position 2016–2017, the attributes include All P (Pitcher), SP (Starting Pitcher), RP (Relief Pitcher), Non-P (Non-Pitcher), C (Catcher), 1B (First Baseman), 2B (Second Baseman), 3B (Third Baseman), SS (Shortstop), LF (Left Fielder), CF (Center Fielder), RF (Right Fielder), OF (Outfielder), DH (Designated Hitter), and PH (Pinch Hitter). In the resulting matrix, all of these attributes are strongly correlated with one another.
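For readers without RapidMiner, here is a hedged sketch of how a similar correlation matrix could be produced with pandas. The file name is hypothetical, and the column labels are assumed to mirror the position attributes listed above.

```python
# Hedged sketch: build a Pearson correlation matrix for the MLB position
# attributes with pandas. The CSV name and column labels are assumptions.
import pandas as pd

df = pd.read_csv("mlb_wins_above_avg_by_position.csv")   # hypothetical file
positions = ["All P", "SP", "RP", "Non-P", "C", "1B", "2B",
             "3B", "SS", "LF", "CF", "RF", "OF", "DH", "PH"]

corr_matrix = df[positions].corr(method="pearson")       # pairwise Pearson r
print(corr_matrix.round(2))

# Coefficients near +1 mean the two position scores rise and fall together;
# coefficients near -1 mean one tends to rise as the other falls.
```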
BIBLIOGRAPHY
Dai, L., Chen, J., Li, S., & Dai, S. (2011). Application of canonical correlation analysis in student score analysis based on data analysis. In S. Lin & X. Huang (Eds.), Advances in Computer Science, Environment, Ecoinformatics, and Education (CSEE 2011), Communications in Computer and Information Science, vol. 217. Springer, Berlin, Heidelberg.
Moody, D. L., & Walsh, P. (1999, June). Measuring the value of information: An asset valuation approach. In ECIS (pp. 496–512).
Mukaka, M. (2012). A guide to appropriate use of correlation coefficient in medical research. Malawi Medical Journal: The Journal of Medical Association of Malawi, 24(3), 69–71.
Taylor, C. (2017, August 11). How to calculate the correlation coefficient. Retrieved from https://www.thoughtco.com/how-to-calculate-the-correlation-coefficient-3126228
Background Check
I am Ira Mae P. Gamban. I’m an 18-year-old junior university student, studying for a degree in Bachelor of Science in Computer Science at the University of Southeastern Philippines. I was born on June 5th ’99 in Cebu City. When I was younger, I used to spend most of my time tinkering with the desktop unit we had at home. I wanted to learn everything that had to do with computers and technology. I learned how to design my own web pages using HTML and CSS. I taught myself the basics of computer logic. Yet as I grew up, the flame of my first interest slowly died down. As Daphne du Maurier would put it, “I am glad it cannot happen twice, the fever of first love. For it is a fever, and a burden, too, whatever the poets may say.”
But I guess you could say that the fever is still hidden somewhere in my subconscious. Come the last year of high school, everyone was set on taking the National College Admissions Examination in hopes of finding the perfect degree program for their skill sets. Surprisingly, or unsurprisingly, my results showed ‘web development’ as the top result. Long story short, our family relocated to Davao City and I enrolled at USeP, at the Institute of Computing, with the BS Computer Science prospectus in hand.
Expectations
“Life's under no obligation to give us what we expect.” ― Margaret Mitchell
This quote never rang so true for me until I started the first semester of my first year of college. You would think that someone with a passion for logic and computers would have an easy time studying them. Wrong. Once the culture shock sets in, life seems more stressful. And it is: you never run out of things to do, and you forget everything you thought you knew.
I expect this course, CS Free Elective 1, to be as hard as, if not tougher than, the courses we have passed before. This subject is where we can expand and broaden our horizons for the ‘real world,’ so to speak. I am predicting group projects or partnerships, written exams, oral presentations, and other means of testing our proficiency. If there is one thing our professors have taught us, it is to never limit our learning to the four walls of the classroom, and this is the time to put that into practice as we look for our specializations.
Preferred Elective
From the elective choices, the one that caught my eye was Machine Learning. In our freshman year, we were asked to do research on anything related to our course. The topic I chose concerned AI systems given the ability to learn without being explicitly programmed, much like the human brain itself.
With the constant evolution of the field, there has been a subsequent rise in the uses, demands, and importance of machine learning. The answer to the question of why one should adopt machine learning would be: ‘High-value predictions that can guide better decisions and smart actions in real time without human intervention.’
- Source: SAS
Machine learning is important because as machines are exposed to new data, they are able to independently adapt, develop, grow, and learn. They learn from previous computations to produce reliable, repeatable decisions and results. Some applications of machine learning are fraud detection and pattern and face recognition, fields I am also very interested in.