913 results for Video interaction analysis : methods and methodology
Abstract:
Rapid economic development has occurred during the past few decades in China, with the Yangtze River Delta (YRD) as one of the most progressive areas. Urbanization, industrialization, agriculture and aquaculture have resulted in extensive production and application of chemicals. Organohalogen contaminants (OHCs) have been widely used, e.g. as pesticides, flame retardants and plasticizers. They are persistent and bioaccumulative and pose a potential threat to ecosystems and human health. However, limited research has been conducted in the YRD with respect to environmental exposure to these chemicals. The main objective of this thesis is to investigate the contamination levels, distribution patterns and sources of OHCs in the YRD. Wildlife from different habitats is used to indicate the environmental pollution situation, and selected matrices are evaluated for use in long-term biomonitoring to determine the environmental stress the contamination may cause. In addition, a method is developed for dicofol analysis. Moreover, a specific effort is made to introduce statistical power analysis to assist in optimal sampling design. The results show extensive contamination of wildlife in the YRD by OHCs. High concentrations of chlorinated paraffins (CPs) are reported in wildlife, in particular in terrestrial species (i.e. the short-tailed mamushi snake and the peregrine falcon). Impurities and byproducts of pentachlorophenol products, i.e. polychlorinated diphenyl ethers (PCDEs) and hydroxylated polychlorinated diphenyl ethers (OH-PCDEs), are identified and reported for the first time in eggs from black-crowned night heron and whiskered tern. High concentrations of octachlorodibenzo-p-dioxin (OCDD) are determined in these samples. The toxic equivalents (TEQs) of polychlorinated dibenzo-p-dioxins (PCDDs) and polychlorinated dibenzofurans (PCDFs) are at mean levels of 300 and 520 pg TEQ g-1 lw (WHO2005 TEQ) in eggs from the two bird species, respectively. This is two orders of magnitude higher than the European Union (EU) regulatory limit for chicken eggs. Also, a novel pattern of polychlorinated biphenyls (PCBs), with octa- to deca-CBs contributing as much as 20% of the total PCBs, is reported in birds. The legacy POPs show a common characteristic of relatively high levels of organochlorine pesticides (i.e. DDT, hexachlorocyclohexanes (HCHs) and Mirex), indicating historic applications. In contrast, rather low concentrations of industrial chemicals such as PCBs and polybrominated diphenyl ethers (PBDEs) are found. A refined and improved analytical method is developed to separate dicofol from its major decomposition product, 4,4'-dichlorobenzophenone, so that dicofol itself can be assessed. Statistical power analysis demonstrates that sampling of sedentary species should be consistently spread over a larger area to monitor temporal trends of contaminants in a robust manner. The results presented in this thesis show high CP and OCDD concentrations in wildlife. The levels and patterns of OHCs in the YRD differ from those in other well-studied areas of the world, most likely because of the extensive production and use of chemicals in the region. The results strongly signal the need for biomonitoring programmes that address the current situation in the YRD. Such programmes will contribute to the management of chemicals and the environment in the YRD, with the potential to grow into the human health sector and to expand to China as a whole.
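The abstract only names statistical power analysis as a tool for sampling design. As a rough illustration of the general idea, the sketch below estimates, by simulation, the power to detect a modest downward temporal trend in log-transformed contaminant concentrations for different numbers of samples per year; the effect size, variance and sample sizes are assumptions chosen for the example, not values from the thesis.

```python
# Minimal sketch (not the thesis' actual procedure): simulation-based power
# analysis for detecting a temporal trend in log-transformed contaminant
# concentrations. Effect size, variance and sample sizes are illustrative.
import numpy as np
from scipy import stats

def trend_power(n_per_year, years=10, annual_change=-0.05,
                sd_log=0.4, alpha=0.05, n_sim=2000, seed=1):
    """Fraction of simulated monitoring series in which a linear
    regression on log-concentration detects the trend (p < alpha)."""
    rng = np.random.default_rng(seed)
    detected = 0
    for _ in range(n_sim):
        t = np.repeat(np.arange(years), n_per_year)
        y = annual_change * t + rng.normal(0.0, sd_log, t.size)
        res = stats.linregress(t, y)
        detected += res.pvalue < alpha
    return detected / n_sim

for n in (2, 5, 10, 20):
    print(f"{n:>2} samples/year -> power ~ {trend_power(n):.2f}")
```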
Abstract:
Aims: Characterization of the representative protozoan Acanthamoeba polyphaga surface carbohydrate exposure by a novel combination of flow cytometry and ligand-receptor analysis. Methods and Results: Trophozoite and cyst morphological forms were exposed to a panel of FITC-lectins. Population fluorescence associated with FITC-lectin binding to acanthamoebal surface moieties was ascertained by flow cytometry. Increasing concentrations of representative FITC-lectins, saturation binding and determination of Kd and relative Bmax values were employed to characterize carbohydrate residue exposure. FITC-lectins specific for N-acetylglucosamine, N-acetylgalactosamine and mannose/glucose were readily bound by trophozoite and cyst surfaces. Minor incremental increases in FITC-lectin concentration resulted in significant differences in surface fluorescence intensity and supported the calculation of ligand-binding determinants, Kd and relative Bmax, which gave a trophozoite and cyst rank order of lectin affinity and surface receptor presence. Conclusions: Trophozoites and cysts expose similar surface carbohydrate residues, foremost amongst which is N-acetylglucosamine, in varying orientation and availability. Significance and Impact of the Study: The outlined versatile combination of flow cytometry and ligand-receptor analysis allowed the characterization of surface carbohydrate exposure by protozoan morphological forms and in turn will support a valid comparison of carbohydrate exposure by other single-cell protozoa and eucaryotic microbes analysed in the same manner.
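As a rough illustration of the ligand-binding analysis described above, the sketch below fits a one-site saturation binding model, F = Bmax * L / (Kd + L), to median fluorescence versus FITC-lectin concentration. The concentrations and fluorescence values are invented, and the one-site model is an assumption for illustration rather than the authors' exact fitting procedure.

```python
# Hedged sketch: fitting a one-site saturation binding model
# F = Bmax * L / (Kd + L) to median fluorescence vs FITC-lectin
# concentration as one way to obtain Kd and relative Bmax.
# Concentrations and fluorescence values below are invented.
import numpy as np
from scipy.optimize import curve_fit

def one_site(L, bmax, kd):
    return bmax * L / (kd + L)

lectin_ug_ml = np.array([0.5, 1, 2, 4, 8, 16, 32])           # hypothetical
median_fluor = np.array([55, 98, 160, 230, 300, 340, 365])   # hypothetical

(bmax, kd), _ = curve_fit(one_site, lectin_ug_ml, median_fluor, p0=[400, 4])
print(f"relative Bmax ~ {bmax:.0f} a.u., Kd ~ {kd:.1f} ug/ml")
```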
Abstract:
A new surface analysis technique has been developed which has a number of benefits compared with conventional Low Energy Ion Scattering Spectrometry (LEISS). A major potential advantage arising from the absence of charge-exchange complications is the possibility of quantification. The instrumentation that has been developed also offers the possibility of unique studies concerning the interaction between low energy ions and atoms and solid surfaces. From these studies it may also be possible, in principle, to generate sensitivity factors to quantify LEISS data. The instrumentation, which is referred to as a Time-of-Flight Fast Atom Scattering Spectrometer (ToF-FASS), has been developed to investigate these conjectures in practice. The development involved a number of modifications to an existing instrument, allowed samples to be bombarded with a monoenergetic pulsed beam of either atoms or ions, and provided the capability to analyse the spectra of scattered atoms and ions separately. Further to this, a system was designed and constructed to allow the incident, exit and azimuthal angles of the particle beam to be varied independently. The key development was that of a pulsed, mass-filtered atom source, which was developed by a cyclic process of design, modelling and experimentation. Although it was possible to demonstrate the unique capabilities of the instrument, problems relating to surface contamination prevented the measurement of the neutralisation probabilities. However, these problems appear to be technical rather than scientific in nature, and could be readily resolved given the appropriate resources. Experimental spectra obtained from a number of samples demonstrate some fundamental differences between the scattered ion and neutral spectra. For practical non-ordered surfaces the ToF spectra are more complex than their LEISS counterparts. This is particularly true for helium scattering where it appears, in the absence of detailed computer simulation, that quantitative analysis is limited to ordered surfaces. Despite this limitation the ToF-FASS instrument opens the way for quantitative analysis of the 'true' surface region for a wider range of surface materials.
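The abstract does not spell out how scattered-particle spectra are interpreted. The sketch below collects the two standard binary-collision relations commonly used for this purpose (time-of-flight to energy conversion, and the single-scattering kinematic factor E1/E0); the masses, flight path and scattering angle are example values, not taken from the thesis.

```python
# Hedged sketch of standard binary-collision relations used to interpret
# LEIS/ToF spectra (not taken from the thesis itself): converting a measured
# flight time to projectile energy, and the kinematic factor E1/E0 for
# single scattering from a surface atom. All numbers below are examples.
import math

AMU = 1.66053906660e-27  # kg
EV  = 1.602176634e-19    # J

def tof_to_energy_ev(t_s, path_m, mass_amu):
    """Kinetic energy (eV) of a particle of given mass after flying
    path_m metres in t_s seconds."""
    v = path_m / t_s
    return 0.5 * mass_amu * AMU * v**2 / EV

def kinematic_factor(m1_amu, m2_amu, theta_deg):
    """E1/E0 for elastic binary scattering of projectile m1 from target m2
    at laboratory scattering angle theta (m2 >= m1 assumed)."""
    r = m2_amu / m1_amu
    th = math.radians(theta_deg)
    return ((math.cos(th) + math.sqrt(r**2 - math.sin(th)**2)) / (1 + r))**2

# Example: He scattered through 135 degrees from Ni; He flying 1 m in 2 us
print(f"E1/E0 ~ {kinematic_factor(4.0026, 58.693, 135.0):.3f}")
print(f"E ~ {tof_to_energy_ev(2.0e-6, 1.0, 4.0026):.0f} eV")
```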
Abstract:
This article explains, first, the reasons why a knowledge of statistics is necessary and describes the role that statistics plays in an experimental investigation. Second, the normal distribution, which describes the natural variability shown by many measurements in optometry and vision sciences, is introduced. Third, the application of the normal distribution to some common statistical problems is described, including how to determine whether an individual observation is a typical member of a population and how to determine the confidence interval for a sample mean.
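As a brief illustration of the two problems mentioned above, the sketch below computes a z-score for a single observation against a population and a 95% confidence interval for a sample mean. The clinical variable and all values are invented for the example, not data from the article.

```python
# Hedged illustration (not from the article): using the normal distribution
# to (a) judge whether a single observation is typical of a population and
# (b) compute a 95% confidence interval for a sample mean. Values invented.
import numpy as np
from scipy import stats

# (a) z-score of one intraocular pressure reading against population values
pop_mean, pop_sd = 16.0, 3.0          # hypothetical population (mmHg)
x = 23.5
z = (x - pop_mean) / pop_sd
print(f"z = {z:.2f}, two-tailed P = {2 * stats.norm.sf(abs(z)):.3f}")

# (b) 95% confidence interval for the mean of a small sample (t-based)
sample = np.array([15.2, 17.1, 16.4, 14.8, 18.0, 16.9])   # hypothetical
m, se = sample.mean(), stats.sem(sample)
lo, hi = stats.t.interval(0.95, df=sample.size - 1, loc=m, scale=se)
print(f"mean = {m:.2f}, 95% CI = ({lo:.2f}, {hi:.2f})")
```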
Abstract:
In this second article, statistical ideas are extended to the problem of testing whether there is a true difference between two samples of measurements. First, it will be shown that the difference between the means of two samples comes from a population of such differences which is normally distributed. Second, the 't' distribution, one of the most important in statistics, will be applied to a test of the difference between two means using a simple data set drawn from a clinical experiment in optometry. Third, in making a t-test, a statistical judgement is made as to whether there is a significant difference between the means of two samples. Before the widespread use of statistical software, this judgement was made with reference to a statistical table. Even if such tables are not used, it is useful to understand their logical structure and how to use them. Finally, the analysis of data that are known to depart significantly from the normal distribution will be described.
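The sketch below illustrates the kind of two-sample comparison the article describes: an unpaired t-test, a Welch-corrected variant (an addition here, not taken from the article), and a rank-based alternative for data that depart from normality. The data are invented.

```python
# Hedged example (data invented): an unpaired two-sample t-test of the kind
# described above, plus Welch's correction and a non-parametric alternative.
import numpy as np
from scipy import stats

group_a = np.array([5.1, 4.8, 5.6, 5.0, 4.7, 5.3])   # hypothetical readings
group_b = np.array([5.9, 6.1, 5.4, 6.3, 5.8, 6.0])

t, p = stats.ttest_ind(group_a, group_b)                        # equal variances
t_w, p_w = stats.ttest_ind(group_a, group_b, equal_var=False)   # Welch's t-test
print(f"Student t = {t:.2f}, P = {p:.4f}")
print(f"Welch   t = {t_w:.2f}, P = {p_w:.4f}")

# If the data clearly depart from normality, a rank-based alternative:
u, p_u = stats.mannwhitneyu(group_a, group_b, alternative="two-sided")
print(f"Mann-Whitney U = {u:.1f}, P = {p_u:.4f}")
```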
Abstract:
In any investigation in optometry involving more than two treatment or patient groups, an investigator should be using ANOVA to analyse the results, assuming that the data conform reasonably well to the assumptions of the analysis. Ideally, specific null hypotheses should be built into the experiment from the start so that the treatment variation can be partitioned to test these effects directly. If 'post-hoc' tests are used, then an experimenter should examine the degree of protection offered by the test against the possibility of making either a type 1 or a type 2 error. All experimenters should be aware of the complexity of ANOVA. The present article describes only one common form of the analysis, viz., that which applies to a single classification of the treatments in a randomised design. There are many different forms of the analysis, each of which is appropriate to a specific experimental design. The uses of some of the most common forms of ANOVA in optometry have been described in a further article. If in any doubt, an investigator should consult a statistician with experience of the analysis of experiments in optometry, since once embarked upon an experiment with an unsuitable design, there may be little that a statistician can do to help.
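The sketch below shows the single-classification (one-way) ANOVA the article refers to, followed by one possible post-hoc comparison (Tukey HSD via statsmodels, which is assumed to be available); the treatment data are invented for illustration.

```python
# Hedged example (data invented): one-way ANOVA for a single classification
# of treatments in a randomised design, with a Tukey HSD post-hoc comparison.
import numpy as np
from scipy import stats
from statsmodels.stats.multicomp import pairwise_tukeyhsd

treatment_1 = [12.1, 13.4, 11.8, 12.9, 13.0]
treatment_2 = [14.2, 15.1, 13.8, 14.9, 15.4]
treatment_3 = [12.5, 12.9, 13.3, 12.2, 13.1]

f, p = stats.f_oneway(treatment_1, treatment_2, treatment_3)
print(f"F = {f:.2f}, P = {p:.4f}")

# Post-hoc pairwise comparison, only meaningful if the overall F is significant
values = np.concatenate([treatment_1, treatment_2, treatment_3])
groups = ["t1"] * 5 + ["t2"] * 5 + ["t3"] * 5
print(pairwise_tukeyhsd(values, groups, alpha=0.05))
```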
Abstract:
1. Pearson's correlation coefficient only tests whether the data fit a linear model. With large numbers of observations, quite small values of r become significant and the X variable may account for only a minute proportion of the variance in Y. Hence, the value of r squared should always be calculated and included in a discussion of the significance of r. 2. The use of r assumes that a bivariate normal distribution is present, and this assumption should be examined prior to the study. If Pearson's r is not appropriate, then a non-parametric correlation coefficient such as Spearman's rs may be used. 3. A significant correlation should not be interpreted as indicating causation, especially in observational studies in which there is a high probability that the two variables are correlated because of their mutual correlations with other variables. 4. In studies of measurement error, there are problems in using r as a test of reliability and the 'intra-class correlation coefficient' should be used as an alternative. A correlation test provides only limited information as to the relationship between two variables. Fitting a regression line to the data using the method known as 'least squares' provides much more information, and the methods of regression and their application in optometry will be discussed in the next article.
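The sketch below follows the advice above: Pearson's r is reported together with r squared, and Spearman's rs is shown as the non-parametric alternative. The paired data are invented for the example.

```python
# Hedged illustration (data invented): Pearson's r with r^2 reported alongside,
# and Spearman's rank correlation as the non-parametric alternative.
import numpy as np
from scipy import stats

x = np.array([1.2, 2.3, 2.9, 4.1, 5.0, 6.2, 7.1, 8.3])
y = np.array([2.0, 2.8, 3.1, 3.9, 4.2, 5.5, 5.9, 7.0])

r, p = stats.pearsonr(x, y)
print(f"Pearson r = {r:.3f}, r^2 = {r**2:.3f}, P = {p:.4f}")

rs, p_s = stats.spearmanr(x, y)
print(f"Spearman rs = {rs:.3f}, P = {p_s:.4f}")
```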
Abstract:
Multiple regression analysis is a complex statistical method with many potential uses. It has also become one of the most abused of all statistical procedures, since anyone with a database and suitable software can carry it out. An investigator should always have a clear hypothesis in mind before carrying out such a procedure, together with knowledge of the limitations of each aspect of the analysis. In addition, multiple regression is probably best used in an exploratory context, identifying variables that might profitably be examined by more detailed studies. Where there are many variables potentially influencing Y, they are likely to be intercorrelated and to account for relatively small amounts of the variance. Any analysis in which R squared is less than 50% should be suspect as probably not indicating the presence of significant variables. A further problem relates to sample size. It is often stated that the number of subjects or patients must be at least 5-10 times the number of variables included in the study [5]. This advice should be taken only as a rough guide, but it does indicate that the variables included should be selected with great care, as inclusion of an obviously unimportant variable may have a significant impact on the sample size required.
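The sketch below illustrates the cautions above with an ordinary least-squares multiple regression in which R squared is inspected before the individual coefficients are interpreted. It uses statsmodels, and the predictors, outcome and sample size are simulated assumptions rather than data from the article.

```python
# Hedged sketch (data simulated): an OLS multiple regression with R^2 checked
# before interpreting coefficients; variable names and effects are invented.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(0)
n = 60                                    # roughly 5-10 subjects per predictor
age      = rng.normal(50, 12, n)
pressure = rng.normal(16, 3, n)
noise    = rng.normal(0, 1.5, n)
outcome  = 0.04 * age + 0.2 * pressure + noise

X = sm.add_constant(np.column_stack([age, pressure]))
model = sm.OLS(outcome, X).fit()
print(f"R^2 = {model.rsquared:.2f}")      # treat R^2 < 0.5 with suspicion
print(model.params)                        # const, age, pressure coefficients
```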
Abstract:
Purpose – Previous reviews of Corporate Social Reporting (CSR) literature have tended to focus on developed economies. The aim of this study is to extend reviews of CSR literature to emerging economies. Design/methodology/approach – A desk-based research method, using a classification framework of three categories. Findings – Most CSR studies in emerging economies have concentrated on the Asia-Pacific and African regions; they are descriptive in nature, use content analysis methods and measure the extent and volume of disclosures contained within annual reports. Such studies provide only indirect explanations of the reasons behind CSR adoption, but of late a handful of studies have started to probe managerial motivations behind CSR directly through in-depth interviews, finding that CSR agendas in emerging economies are largely driven by external forces, namely pressures from parent companies, international markets and international agencies. Originality/value – This is the first review and analysis of CSR studies from the emerging economy perspective. Following this analysis, the authors have identified some important future research questions.
Abstract:
This paper considers the contemporary use of focus groups as a method of data collection within qualitative research settings. The authors draw upon their own experiences of using focus groups in educational and 'community' user-group environments in order to provide an overview of recent issues and debates surrounding the deployment of focus group methods and to pick out specific areas of contention in relation to both their epistemological and practical implications. Accordingly, the paper reflects on some of the realities of 'doing' focus groups whilst, at the same time, highlighting common problems and dilemmas which beginning researchers might encounter in their application. In turn, the paper raises a number of related issues around which there appears to have been a lack of academic discussion to date.
Abstract:
Report published in the Proceedings of the National Conference on "Education and Research in the Information Society", Plovdiv, May 2016.
Abstract:
One of the key tasks of quality management is to identify the factors that are critical to value creation, to determine their value, and to take action to prevent and reduce their negative effects. Value creation in many cases takes place through processes, which consist of activities and tasks to be carried out. These require suitable staff, one of whose most important characteristics is the knowledge they possess. The task-knowledge-resource structure is therefore also a quality management concern. Network science, which deals with the analysis of complex systems, can provide tools for this, so it is worth examining its applicability in the quality management field. To systematize the possible applications, the authors categorized quality networks according to the types of nodes (vertices) and links (edges or arcs). Focusing on knowledge management, they then defined the multimodal knowledge network (one built from several different node types), consisting of tasks, resources, knowledge items and the connections between them. Using this network, the knowledge items were assigned to categories and their value was determined on the basis of node degrees. In the knowledge-item network derived from the multimodal network, the meaning of cohesive subgroups was defined, and finally a formula was proposed for determining the risk of knowledge loss.
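The abstract does not give the network construction itself. The sketch below shows, on toy data, what a multimodal task-resource-knowledge network and a degree-based valuation of knowledge items could look like in networkx, together with a knowledge-item projection in which cohesive groups can be sought. The node names, edges and the interpretation of degree are illustrative assumptions, not the authors' case-study data or exact formula.

```python
# Hedged sketch (toy data): a multimodal task-resource-knowledge network,
# knowledge items valued by degree, and a knowledge-item projection.
import networkx as nx
from networkx.algorithms import bipartite

G = nx.Graph()
# node type stored as an attribute: "task", "resource" or "knowledge"
G.add_nodes_from(["T1", "T2"], kind="task")
G.add_nodes_from(["R1", "R2"], kind="resource")
G.add_nodes_from(["K1", "K2", "K3"], kind="knowledge")
G.add_edges_from([("T1", "K1"), ("T1", "K2"), ("T2", "K2"),
                  ("T2", "K3"), ("R1", "K1"), ("R1", "K2"), ("R2", "K3")])

knowledge = [n for n, d in G.nodes(data=True) if d["kind"] == "knowledge"]

# Degree as a simple value indicator: a knowledge item needed by many tasks
# but held by few resources is both valuable and at risk of being lost.
for k in knowledge:
    tasks = [n for n in G[k] if G.nodes[n]["kind"] == "task"]
    holders = [n for n in G[k] if G.nodes[n]["kind"] == "resource"]
    print(f"{k}: degree={G.degree[k]}, used by {len(tasks)} task(s), "
          f"held by {len(holders)} resource(s)")

# Knowledge-item projection: items are linked if they share a task or resource
K = bipartite.weighted_projected_graph(G, knowledge)
print(list(K.edges(data=True)))
```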
Abstract:
In the article - Menu Analysis: Review and Evaluation - by Lendal H. Kotschevar, Distinguished Professor, School of Hospitality Management, Florida International University, Kotschevar's initial statement reads: “Various methods are used to evaluate menus. Some have quite different approaches and give different information. Even those using quite similar methods vary in the information they give. The author attempts to describe the most frequently used methods and to indicate their value. A correlation calculation is made to see how well certain of these methods agree in the information they give.” There is more than one way to look at the word menu. The culinary selections decided upon by the head chef or owner of a restaurant, which ultimately define the type of restaurant, are one way. The physical outline of the food, which a patron actually holds in his or her hand, is another. These are the descriptions most commonly attached to the word menu. The author concentrates primarily on the latter, and uses the act of counting the number of items sold on a menu to measure the popularity of any particular item. This, along with a formula, allows Kotschevar to arrive at a specific value per item. Menu analysis would appear a difficult subject to broach: how does one qualify and quantify a menu? It seems such a subjective exercise. The author offers methods and outlines for approaching menu analysis from an empirical perspective. “Menus are often examined visually through the evaluation of various factors. It is a subjective method but has the advantage of allowing scrutiny of a wide range of factors which other methods do not,” says Distinguished Professor, Kotschevar. “The method is also highly flexible. Factors can be given a score value and scores summed to give a total for a menu. This allows comparison between menus. If the one making the evaluations knows menu values, it is a good method of judgment,” he further offers. The author wants you to know that assigning values is fundamental to a pragmatic menu analysis; it is how the reviewer keeps score, so to speak. Value merit provides reliable criteria from which to gauge a particular menu item. In the final analysis, menu evaluation provides the mechanism for either keeping or rejecting selected items on a menu. Kotschevar provides at least three different matrix evaluation methods; they are defined as the Miller method, the Smith and Kasavana method, and the Pavesic method. He offers illustrated examples of each via a table format. These are helpful tools, since trying to explain the theories behind the tables would be difficult at best. Kotschevar also references examples of analysis methods which aren't matrix based. The Hayes and Huffman - Goal Value Analysis - is one such method. The author sees no one method as better than another, and suggests that combining two or more of the methods can be beneficial.
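As a small illustration of the matrix style of evaluation mentioned above, the sketch below classifies items in the spirit of the Smith and Kasavana menu-engineering matrix: popularity (menu mix) against contribution margin relative to the menu averages. The menu data and the 70% popularity rule used here are assumptions for the example, not figures from the article.

```python
# Hedged sketch in the spirit of the Smith and Kasavana matrix method:
# items classified by popularity and contribution margin vs the menu averages.
# The menu items, prices and 70% popularity rule are illustrative assumptions.
from dataclasses import dataclass

@dataclass
class MenuItem:
    name: str
    sold: int
    price: float
    food_cost: float

    @property
    def margin(self) -> float:           # contribution margin per item sold
        return self.price - self.food_cost

menu = [
    MenuItem("Grilled salmon", 180, 21.0, 8.5),
    MenuItem("House burger",   320, 14.0, 5.0),
    MenuItem("Duck confit",     60, 26.0, 11.0),
    MenuItem("Veggie pasta",    90, 12.0, 4.0),
]

total_sold = sum(i.sold for i in menu)
avg_margin = sum(i.margin * i.sold for i in menu) / total_sold
pop_threshold = 0.7 * (1.0 / len(menu))   # 70% of an equal share of sales

for i in menu:
    popular = (i.sold / total_sold) >= pop_threshold
    profitable = i.margin >= avg_margin
    label = {(True, True): "Star", (True, False): "Plowhorse",
             (False, True): "Puzzle", (False, False): "Dog"}[(popular, profitable)]
    print(f"{i.name:<15} mix={i.sold/total_sold:.0%} margin=${i.margin:.2f} -> {label}")
```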
Abstract:
The successful performance of a hydrological model is usually challenged by the quality of the sensitivity analysis, calibration and uncertainty analysis carried out in the modeling exercise and the subsequent simulation results. This is especially important under changing climatic conditions, where additional uncertainties associated with climate models and downscaling processes increase the complexity of the hydrological modeling system. In response to these challenges, and to improve the performance of hydrological models under changing climatic conditions, this research proposed five new methods for supporting hydrological modeling. First, a design of experiment aided sensitivity analysis and parameterization (DOE-SAP) method was proposed to investigate the significant parameters and provide more reliable sensitivity analysis for improving parameterization during hydrological modeling. Better calibration results, along with an advanced sensitivity analysis of the significant parameters and their interactions, were achieved in the case study. Second, a comprehensive uncertainty evaluation scheme was developed to evaluate three uncertainty analysis methods: the sequential uncertainty fitting version 2 (SUFI-2), generalized likelihood uncertainty estimation (GLUE) and parameter solution (ParaSol) methods. The results showed that SUFI-2 performed better than the other two methods based on the calibration and uncertainty analysis results, and the proposed evaluation scheme proved capable of selecting the most suitable uncertainty method for a case study. Third, a novel sequential multi-criteria based calibration and uncertainty analysis (SMC-CUA) method was proposed to improve the efficiency of calibration and uncertainty analysis and to control the phenomenon of equifinality. The results showed that the SMC-CUA method was able to provide better uncertainty analysis results with high computational efficiency compared to the SUFI-2 and GLUE methods, and to control parameter uncertainty and the equifinality effect without sacrificing simulation performance. Fourth, an innovative response based statistical evaluation method (RESEM) was proposed for estimating the uncertainty propagated effects and providing long-term predictions of hydrological responses under changing climatic conditions. By using RESEM, the uncertainty propagated from statistical downscaling to hydrological modeling can be evaluated. Fifth, an integrated simulation-based evaluation system for uncertainty propagation analysis (ISES-UPA) was proposed for investigating the effects and contributions of different uncertainty components to the total propagated uncertainty from statistical downscaling. Using ISES-UPA, the uncertainty from statistical downscaling, the uncertainty from hydrological modeling, and the total uncertainty from the two uncertainty sources can be compared and quantified. The feasibility of all the methods has been tested using hypothetical and real-world case studies. The proposed methods can also be integrated as a hydrological modeling system to better support hydrological studies under changing climatic conditions. The results from the proposed integrated hydrological modeling system can be used as scientific references for decision makers to reduce the potential risk of damages caused by extreme events in long-term water resource management and planning.
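Of the methods named above, GLUE is the most straightforward to illustrate in a few lines. The sketch below runs a GLUE-style Monte Carlo analysis on a toy single-parameter rainfall-runoff model with synthetic data: parameter sets whose Nash-Sutcliffe efficiency exceeds an informal threshold are kept as behavioural, and prediction bounds are derived from them. The toy model, threshold and data are assumptions for illustration only, not the thesis' actual modeling setup.

```python
# Hedged sketch of a GLUE-style (generalized likelihood uncertainty estimation)
# analysis on a toy single-parameter linear reservoir with synthetic data.
import numpy as np

rng = np.random.default_rng(42)

def linear_reservoir(rain, k):
    """Toy rainfall-runoff model: storage drained at rate k per time step."""
    s, q = 0.0, []
    for r in rain:
        s += r
        out = k * s
        s -= out
        q.append(out)
    return np.array(q)

def nse(sim, obs):
    """Nash-Sutcliffe efficiency, used here as the informal likelihood measure."""
    return 1 - np.sum((sim - obs) ** 2) / np.sum((obs - obs.mean()) ** 2)

rain = rng.gamma(0.6, 5.0, size=200)
obs = linear_reservoir(rain, k=0.35) + rng.normal(0, 0.4, 200)  # synthetic "observations"

# Monte Carlo sampling of the parameter, keeping behavioural sets (NSE > 0.7)
ks = rng.uniform(0.05, 0.95, 2000)
sims = np.array([linear_reservoir(rain, k) for k in ks])
scores = np.array([nse(s, obs) for s in sims])
behavioural = scores > 0.7

# 5th-95th percentile prediction bounds from the behavioural simulations
# (a full GLUE analysis would weight each simulation by its likelihood)
lower = np.quantile(sims[behavioural], 0.05, axis=0)
upper = np.quantile(sims[behavioural], 0.95, axis=0)
print(f"behavioural sets: {behavioural.sum()} of {ks.size}, "
      f"mean 90% bound width: {(upper - lower).mean():.2f}")
```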