977 results for log-on
Abstract:
Many papers claim that a Log Periodic Power Law (LPPL) model fitted to financial market bubbles that precede large market falls or 'crashes' contains parameters that are confined within certain ranges. Further, it is claimed that the underlying model is based on influence percolation and a martingale condition. This paper examines these claims and their validity for capturing large price falls in the Hang Seng stock market index over the period 1970 to 2008. The fitted LPPLs have parameter values within the ranges specified post hoc by Johansen and Sornette (2001) for only seven of the 11 crashes identified in this period. Interestingly, the LPPL fit could have predicted the substantial fall in the Hang Seng index during the recent global downturn. Overall, the mechanism posited as underlying the LPPL model does not do so, and the data used to support the fit of the LPPL model to bubbles does so only partially.
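For readers unfamiliar with the model referred to above, the following is a minimal, hedged sketch of fitting an LPPL of the Johansen-Ledoit-Sornette form to a log-price series with SciPy. The synthetic data, starting values, and the exact parametrization are illustrative assumptions, not the paper's procedure or data.

```python
# Hedged sketch: least-squares fit of an LPPL curve to a (synthetic) log-price series.
import numpy as np
from scipy.optimize import curve_fit

def lppl(t, tc, m, omega, phi, A, B, C):
    # log p(t) = A + B*(tc - t)^m * (1 + C*cos(omega*ln(tc - t) + phi))
    dt = np.maximum(tc - t, 1e-8)          # guard against tc <= t during the search
    return A + B * dt**m * (1.0 + C * np.cos(omega * np.log(dt) + phi))

# Synthetic log-price series generated from known parameters, then refitted.
rng = np.random.default_rng(0)
t = np.arange(0.0, 500.0)
true = dict(tc=550.0, m=0.5, omega=8.0, phi=1.0, A=8.5, B=-0.015, C=0.1)
log_price = lppl(t, **true) + 0.002 * rng.standard_normal(t.size)

p0 = [560.0, 0.4, 7.0, 0.5, 8.4, -0.01, 0.05]   # tc, m, omega, phi, A, B, C
params, _ = curve_fit(lppl, t, log_price, p0=p0, maxfev=20000)
print(dict(zip(["tc", "m", "omega", "phi", "A", "B", "C"], np.round(params, 3))))
```

In claims like those examined above, it is the fitted values of tc, m and omega that are checked against the post-hoc parameter ranges.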
Abstract:
In many of the Statnotes described in this series, the statistical tests assume the data are a random sample from a normal distribution. These Statnotes include most of the familiar statistical tests, such as the ‘t’ test, analysis of variance (ANOVA), and Pearson’s correlation coefficient (‘r’). Nevertheless, many variables exhibit a more or less ‘skewed’ distribution. A skewed distribution is asymmetrical, with the peak displaced either to the left (positive skew) or to the right (negative skew). If the mean of the distribution is low, the degree of variation large, and values can only be positive, a positively skewed distribution is usually the result. Many distributions potentially have a low mean and high variance, including the abundance of bacterial species on plants, the latent period of an infectious disease, and the sensitivity of certain fungi to fungicides. Such positively skewed distributions are often fitted successfully by a variant of the normal distribution called the log-normal distribution. This Statnote describes fitting the log-normal distribution with reference to two scenarios: (1) the frequency distribution of bacterial numbers isolated from cloths in a domestic environment, and (2) the sizes of lichenised ‘areolae’ growing on the hypothallus of Rhizocarpon geographicum (L.) DC.
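As an illustration of the kind of fit the Statnote describes, here is a minimal sketch of fitting a log-normal distribution to positively skewed counts with SciPy. The data are simulated placeholders, not the Statnote's bacterial or areolae measurements.

```python
# Hedged sketch: fit a two-parameter log-normal to skewed positive data.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
counts = rng.lognormal(mean=4.0, sigma=1.2, size=200)   # simulated skewed counts

# Fix the location parameter at 0 so the fit is a two-parameter log-normal.
shape, loc, scale = stats.lognorm.fit(counts, floc=0)
mu, sigma = np.log(scale), shape                        # mean and sd of log(X)
print(f"log-mean = {mu:.2f}, log-sd = {sigma:.2f}")

# A Kolmogorov-Smirnov test gives a rough goodness-of-fit check.
ks = stats.kstest(counts, 'lognorm', args=(shape, loc, scale))
print(f"KS statistic = {ks.statistic:.3f}, p = {ks.pvalue:.3f}")
```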
Abstract:
The article presents a new type of logs merging tool for multiple-blade telecommunication systems, based on the development of a new approach. The introduction of the new logs merging tool (the Log Merger) can help engineers to build a process behavior timeline with a flexible system of information structuring used to assess changes in the analyzed system. This logs merging system, based on the experts' experience and their analytical skills, generates a knowledge base which could be advantageous in further decision-making expert system development. This paper proposes and discusses the design and implementation of the Log Merger, its architecture, multi-board analysis capability and application areas. The paper also presents possible ways of further tool improvement, e.g., extending its functionality to cover additional system platforms. The possibility of adding an analysis module for further expert system development is also considered.
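The core operation such a tool performs is merging per-board log files into one timestamp-ordered timeline. The sketch below illustrates that basic step only; the log format (ISO timestamp followed by a message) is an assumption for illustration, not the Log Merger's actual format or architecture.

```python
# Hedged sketch: merge already time-ordered per-board logs into one timeline.
import heapq
from datetime import datetime

def parse(line, board):
    ts, msg = line.split(" ", 1)
    return datetime.fromisoformat(ts), board, msg.rstrip()

def merged_timeline(board_files):
    """board_files: {board_name: iterable of log lines}, each already time-ordered."""
    streams = ((parse(line, board) for line in lines)
               for board, lines in board_files.items())
    return heapq.merge(*streams)    # lazily yields (timestamp, board, message)

logs = {
    "blade-1": ["2021-03-01T10:00:01 link up", "2021-03-01T10:00:05 sync lost"],
    "blade-2": ["2021-03-01T10:00:03 process restarted"],
}
for ts, board, msg in merged_timeline(logs):
    print(ts.isoformat(), board, msg)
```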
Abstract:
AMS subject classification: 90B80.
Abstract:
In this paper, we give several results for majorized matrices by using continuous convex functions and the Green function. We obtain mean value theorems for majorized matrices and also give corresponding Cauchy means, as well as prove that these means are monotonic. We prove positive semi-definiteness of matrices generated by differences deduced from majorized matrices, which implies exponential convexity and log-convexity of these differences, and we also obtain Lyapunov- and Dresher-type inequalities for these differences.
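For context, a reminder of the textbook vector majorization relation and the classical convex-function inequality that results of this kind build on; these are standard definitions, not taken from the paper itself.

```latex
% Vector majorization (Hardy--Littlewood--P\'olya): x \prec y means, with
% components arranged in decreasing order x_{[1]} \ge \dots \ge x_{[n]},
\sum_{i=1}^{k} x_{[i]} \le \sum_{i=1}^{k} y_{[i]}, \quad k = 1,\dots,n-1,
\qquad \sum_{i=1}^{n} x_{[i]} = \sum_{i=1}^{n} y_{[i]},
% and then every continuous convex function f satisfies
\sum_{i=1}^{n} f(x_i) \le \sum_{i=1}^{n} f(y_i).
```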
Abstract:
This dissertation develops a new mathematical approach that overcomes the effect of a data processing phenomenon known as "histogram binning" inherent to flow cytometry data. A real-time procedure is introduced to prove the effectiveness and fast implementation of such an approach on real-world data. The histogram binning effect is a dilemma posed by two seemingly antagonistic developments: (1) flow cytometry data in its histogram form is extended in its dynamic range to improve its analysis and interpretation, and (2) the inevitable dynamic range extension introduces an unwelcome side effect, the binning effect, which skews the statistics of the data, undermining as a consequence the accuracy of the analysis and the eventual interpretation of the data. Researchers in the field contended with such a dilemma for many years, resorting either to hardware approaches that are rather costly with inherent calibration and noise effects; or have developed software techniques based on filtering the binning effect but without successfully preserving the statistical content of the original data. The mathematical approach introduced in this dissertation is so appealing that a patent application has been filed. The contribution of this dissertation is an incremental scientific innovation based on a mathematical framework that will allow researchers in the field of flow cytometry to improve the interpretation of data knowing that its statistical meaning has been faithfully preserved for its optimized analysis. Furthermore, with the same mathematical foundation, proof of the origin of such an inherent artifact is provided. These results are unique in that new mathematical derivations are established to define and solve the critical problem of the binning effect faced at the experimental assessment level, providing a data platform that preserves its statistical content. In addition, a novel method for accumulating the log-transformed data was developed. This new method uses the properties of the transformation of statistical distributions to accumulate the output histogram in a non-integer and multi-channel fashion. Although the mathematics of this new mapping technique seem intricate, the concise nature of the derivations allow for an implementation procedure that lends itself to a real-time implementation using lookup tables, a task that is also introduced in this dissertation.
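To make the "non-integer, multi-channel accumulation via lookup tables" idea concrete, here is a hedged sketch: each raw linear value is split fractionally between the two neighbouring log-scale channels using precomputed tables. The channel counts, scaling and weighting scheme are illustrative assumptions, not the dissertation's patented method.

```python
# Hedged sketch: lookup-table accumulation of raw events into log-scale channels.
import numpy as np

ADC_MAX = 4096           # raw linear channels (assumed 12-bit acquisition)
N_LOG = 256              # log-scale histogram channels (assumed)

raw = np.arange(1, ADC_MAX)                             # raw values 1 .. 4095
pos = np.log10(raw) / np.log10(ADC_MAX) * (N_LOG - 1)   # fractional log-channel
lo = np.floor(pos).astype(int)                          # lower neighbouring channel
lut_lo, lut_hi, lut_w = lo, lo + 1, pos - lo            # the lookup tables

def accumulate(events, hist):
    """Add raw events (values 1..ADC_MAX-1) into a float log-channel histogram."""
    idx = events - 1
    np.add.at(hist, lut_lo[idx], 1.0 - lut_w[idx])      # weight to lower channel
    np.add.at(hist, lut_hi[idx], lut_w[idx])            # remainder to upper channel

hist = np.zeros(N_LOG)
events = np.random.default_rng(0).integers(1, ADC_MAX, size=10_000)
accumulate(events, hist)
print(hist.sum())        # 10000.0 -- total counts are preserved
```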
Abstract:
The high concentration of underprepared students in community colleges presents a challenge to educators, policy-makers, and researchers. All have pointed to low completion rates and caution that institutional practices and policy ought to focus on improving retention and graduation rates. However, a multitude of inhibiting factors limits the educational opportunities of underprepared community college students. Using Tinto's (1993) and Astin's (1999) models of student departure as the primary theoretical framework, as well as faculty mentoring as a strategy to impact student performance and retention, the purpose of this study was to determine whether a mentoring program designed to promote greater student-faculty interactions with underprepared community college students is predictive of higher retention for such students. While many studies have documented the positive effects of faculty mentoring with 4-year university students, very few have examined faculty mentoring with underprepared community college students (Campbell and Campbell, 1997; Nora & Crisp, 2007). In this study, the content of student-faculty interactions captured during the mentoring experience was operationalized into eight domains. Faculty members used a log to record their interactions with students. During interactions they tried to help students develop study skills, set goals, and manage their time. They also provided counseling, gave encouragement, nurtured confidence, secured financial aid/grants/scholarships, and helped students navigate their first semester at college. Logistic regression results showed that both frequency and content of faculty interactions were important predictors of retention. Students with high levels of faculty interactions in the area of educational planning and personal/family concerns were more likely to persist. Those with high levels of interactions in time-management and academic concerns were less likely to persist. Interactions that focused on students' poor grades, unpreparedness for class, or excessive absences were predictive of dropping out. Those that focused on developing a program of study, creating a road map to completion, or students' self-perceptions, feelings of self-efficacy, and personal control were predictive of persistence.
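The analysis described above lends itself to a short illustration: a logistic regression of persistence on the frequency of logged interactions per content domain. The variable names, simulated data and coefficient signs below are hypothetical stand-ins that merely mirror the reported direction of effects, not the study's data or model.

```python
# Hedged sketch: logistic regression of retention on interaction counts by domain.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 300
df = pd.DataFrame({
    "n_edu_planning":    rng.poisson(2, n),   # educational-planning contacts
    "n_personal_family": rng.poisson(1, n),   # personal/family-concern contacts
    "n_time_mgmt":       rng.poisson(1, n),   # time-management contacts
    "n_academic":        rng.poisson(2, n),   # academic-concern contacts
})
# Simulated outcome with signs mirroring the abstract's findings.
logit_p = (-0.2 + 0.5*df.n_edu_planning + 0.4*df.n_personal_family
           - 0.4*df.n_time_mgmt - 0.3*df.n_academic)
df["persisted"] = rng.binomial(1, 1/(1 + np.exp(-logit_p)))

model = smf.logit("persisted ~ n_edu_planning + n_personal_family "
                  "+ n_time_mgmt + n_academic", data=df).fit(disp=False)
print(model.summary())
```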
Abstract:
Hydrophobicity, as measured by Log P, is an important molecular property related to toxicity and carcinogenicity. With increasing public health concerns over the effects of Disinfection By-Products (DBPs), there are considerable benefits in developing Quantitative Structure-Activity Relationship (QSAR) models capable of accurately predicting Log P. In this research, Log P values of 173 DBP compounds in 6 functional classes were used to develop QSAR models by applying 3 molecular descriptors, namely the Energy of the Lowest Unoccupied Molecular Orbital (ELUMO), the Number of Chlorine atoms (NCl) and the Number of Carbon atoms (NC), in Multiple Linear Regression (MLR) analysis. The QSAR models developed were validated based on the Organization for Economic Co-operation and Development (OECD) principles. The model Applicability Domain (AD) and mechanistic interpretation were explored. Considering the very complex nature of DBPs, the established QSAR models performed very well with respect to goodness-of-fit, robustness and predictability. The predicted values of Log P of DBPs by the QSAR models were found to be significant, with a coefficient of determination (R2) from 81% to 98%. The Leverage Approach with Williams plots was applied to detect and remove outliers, consequently increasing R2 by approximately 2% to 13% for the different DBP classes. The developed QSAR models were statistically validated for their predictive power by the Leave-One-Out (LOO) and Leave-Many-Out (LMO) cross-validation methods. Finally, Monte Carlo simulation was used to assess the variations and inherent uncertainties in the QSAR models of Log P and to determine the most influential parameters in connection with Log P prediction. The developed QSAR models in this dissertation will have a broad applicability domain because the research data set covered six out of eight common DBP classes, including halogenated alkanes, halogenated alkenes, halogenated aromatics, halogenated aldehydes, halogenated ketones, and halogenated carboxylic acids, which have been brought to the attention of regulatory agencies in recent years. Furthermore, the QSAR models are suitable for prediction of similar DBP compounds within the same applicability domain. The selection and integration of the various methodologies developed in this research may also benefit future research in similar fields.
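The modelling step described above (MLR of Log P on ELUMO, NCl and NC, checked by leave-one-out cross-validation) can be sketched as follows. The descriptor values are synthetic stand-ins and the coefficients are arbitrary; this is an illustration of the regression setup, not the dissertation's fitted models.

```python
# Hedged sketch: MLR of Log P on three descriptors, with a LOO check.
import numpy as np
import pandas as pd
from sklearn.linear_model import LinearRegression
from sklearn.model_selection import LeaveOneOut, cross_val_score

rng = np.random.default_rng(0)
n = 173                                      # same count as the DBP data set
df = pd.DataFrame({
    "ELUMO": rng.normal(-0.5, 0.3, n),       # eV, illustrative scale
    "NCl":   rng.integers(0, 4, n),
    "NC":    rng.integers(1, 7, n),
})
df["logP"] = 0.4*df.NC + 0.5*df.NCl - 1.2*df.ELUMO + rng.normal(0, 0.3, n)

X, y = df[["ELUMO", "NCl", "NC"]], df["logP"]
mlr = LinearRegression().fit(X, y)
print("R^2 (training fit):", round(mlr.score(X, y), 3))
print("coefficients:", dict(zip(X.columns, mlr.coef_.round(3))))

# Leave-one-out estimate of predictive error (MSE; per-fold R^2 is undefined).
mse = -cross_val_score(LinearRegression(), X, y, cv=LeaveOneOut(),
                       scoring="neg_mean_squared_error").mean()
print("LOO mean squared error:", round(mse, 3))
```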
Abstract:
This project is about retrieving data in a range without allowing the server to read it, when the database is stored on the server. Basically, our goal is to build a database that allows the client to maintain the confidentiality of the stored data, even though all the data is stored in a location different from the client's hard disk. This means that all the information written on that disk could easily be read by another party, who could sell it or log into accounts and use them for stealing money or identities, so the data must be protected from eavesdroppers and other parties. In order to achieve this, we need to encrypt the data stored on the drive, so that only the possessor of the key can easily read the information, while everyone else can read only encrypted data. Consequently, all the data management must be done by the client, otherwise any malicious party could easily retrieve the data and use it with malicious intent. All the methods analysed here rely on encrypting data in transit. At the end of this project we analyse 2 theoretical and practical methods for the creation of such databases and then test them with 3 datasets and with 10, 100 and 1000 queries. The scope of this work is to identify a trend that can be useful for future work based on this project.
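One common way to support range retrieval over data the server cannot read is bucketization: the client encrypts each record but stores it under a coarse bucket tag, the server returns whole buckets, and the client decrypts and filters. The sketch below illustrates that general idea only; it is not one of the two specific methods evaluated in the project, and the bucket width is an arbitrary assumption.

```python
# Hedged sketch: client-side encryption with coarse buckets for range queries.
from cryptography.fernet import Fernet

BUCKET = 100                      # bucket width (assumed granularity)
key = Fernet.generate_key()       # kept only on the client
f = Fernet(key)

def store(server_db, value: int, payload: str):
    token = f.encrypt(f"{value}:{payload}".encode())
    server_db.setdefault(value // BUCKET, []).append(token)   # server sees only buckets

def range_query(server_db, lo: int, hi: int):
    # Server side: return every bucket overlapping [lo, hi].
    tokens = [t for b in range(lo // BUCKET, hi // BUCKET + 1)
              for t in server_db.get(b, [])]
    # Client side: decrypt and apply the exact range predicate.
    out = []
    for t in tokens:
        value, payload = f.decrypt(t).decode().split(":", 1)
        if lo <= int(value) <= hi:
            out.append((int(value), payload))
    return out

db = {}
for v in (5, 120, 137, 260, 999):
    store(db, v, f"record-{v}")
print(range_query(db, 100, 300))   # -> records with values 120, 137, 260
```

The trade-off illustrated here is the usual one: coarser buckets leak less about the stored values but force the client to download and discard more non-matching records.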
Abstract:
We investigate the use of the rest-frame 24 μm luminosity as an indicator of the star formation rate (SFR) in galaxies with different metallicities by comparing it to the (extinction-corrected) Hα luminosity. We carry out this analysis in two steps: First, we compare the emission from H II regions in different galaxies with metallicities between 12 + log(O/H) = 8.1 and 8.9. We find that the 24 μm and the extinction-corrected Hα luminosities from individual H II regions follow the same correlation for all galaxies, independent of their metallicity. Second, the role of metallicity is explored further for the integrated luminosity in a sample of galaxies with metallicities in the range of 12 + log(O/H) = 7.2-9.1. For this sample we compare the 24 μm and Hα luminosities integrated over the entire galaxies and find a lack of 24 μm emission for a given Hα luminosity for low-metallicity objects, likely reflecting a low dust content. These results suggest that the 24 μm luminosity is a good metallicity-independent tracer of the SFR in individual H II regions. On the other hand, metallicity has to be taken into account when using the 24 μm luminosity as a tracer of the SFR of entire galaxies.
Abstract:
Parking is often underpriced, and expanding its capacity is expensive; universities need a better way of reducing congestion other than building costly parking garages. Demand-based pricing mechanisms, such as auctions, offer a possible solution to the problem by promising to reduce parking at peak times. However, faculty, students, and staff at universities have systematically different parking needs, leading to different parking valuations. In this study, I determine the impact university affiliation has on predicting bid values cast in three Dutch auctions of on-campus parking permits sold at Chapman University in Fall 2010. Using clustering techniques cross-checked with university demographic information to detect affiliation groups, I ran a log-linear regression and found that university affiliation had a larger effect on bid amount than lot location or the fraction of the auction's duration that had elapsed. Generally, faculty were predicted to place higher bids, whereas students were predicted to place lower bids.
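A log-linear specification of the kind described above can be sketched as follows. The data are synthetic stand-ins and the column names are hypothetical; the sketch only shows the regression setup, not the study's estimates.

```python
# Hedged sketch: log-bid regression on affiliation, lot and auction-time fraction.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 200
bids = pd.DataFrame({
    "affiliation":      rng.choice(["faculty", "staff", "student"], n),
    "lot":              rng.choice(["A", "B", "C"], n),
    "auction_fraction": rng.uniform(0, 1, n),     # fraction of auction elapsed
})
base = {"faculty": 5.6, "staff": 5.3, "student": 5.0}   # arbitrary simulated levels
bids["bid"] = np.exp(bids.affiliation.map(base)
                     + 0.1*bids.auction_fraction
                     + rng.normal(0, 0.2, n))

model = smf.ols("np.log(bid) ~ C(affiliation) + C(lot) + auction_fraction",
                data=bids).fit()
print(model.params)     # affiliation dummies are read relative to the baseline group
```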
Abstract:
A pressurized core with CH4 hydrate or dissolved CH4 should evolve gas volumes in a predictable manner as pressure is released over time at isothermal conditions. Incremental gas volumes were collected as pressure was released over time from 29 pressure core sampler (PCS) cores from Sites 994, 995, 996, and 997 on the Blake Ridge. Most of these cores were kept at or near 0°C with an ice bath, and many of these cores yielded substantial quantities of CH4. Volume-pressure plots were constructed for 20 of these cores. Only five plots conform to expected volume and pressure changes for sediment cores with CH4 hydrate under initial pressure and temperature conditions. However, other evidence suggests that sediment in these five and at least five other PCS cores contained CH4 hydrate before core recovery and gas release. Detection of CH4 hydrate in a pressurized sediment core through volume-pressure relationships is complicated by two factors. First, significant quantities of CH4-poor borehole water fill the PCS and come into contact with the core. This leads to dilution of CH4 concentration in interstitial water and, in many cases, decomposition of CH4 hydrate before a degassing experiment begins. Second, degassing experiments were conducted after the PCS had equilibrated in an ice-water bath (0°C). This temperature is significantly lower than in situ values in the sediment formation before core recovery. Our results and interpretations for PCS cores collected on Leg 164 imply that pressurized containers formerly used by the Deep Sea Drilling Project (DSDP) and currently used by ODP are not appropriately designed for direct detection of gas hydrate in sediment at in situ conditions through volume-pressure relationships.
Abstract:
In May 1964 the Institute of Marine Science (University of Miami), Scripps Institution of Oceanography (University of California), Woods Hole Oceanographic Institution, and Lamont Geological Observatory (Columbia University) joined in the establishment of the JOINT OCEANOGRAPHIC INSTITUTIONS DEEP EARTH SAMPLING (JOIDES) program. The long range purpose of this organization is to obtain continuous core samples of the entire sedimentary column from the floors of the oceans. It was decided that initial efforts would be limited to water depths of less than 1000 fathoms (6000 feet), and tentative locations were selected for drilling operations off the eastern, western and Gulf coasts of the United States. Near the end of December 1964 it was found that the M/V Caldrill I, a drilling vessel capable of working to depths of 6000 feet, was to engage in drilling operations on the Grand Banks of Newfoundland during the summer of 1965 for the Pan American Petroleum Corporation. Thus it was agreed to organize a drilling program along the track of Caldrill between California and the Grand Banks. Selection was made of an area on the continental shelf and the Blake Plateau off Jacksonville, Florida. Based upon many previous geological and geophysical investigations by the participating laboratories, a considerable body of knowledge had been gained about this region of the continental-oceanic border. For this initial program of JOIDES, the Lamont Geological Observatory was chosen as the operating institution with J. L. Worzel as principal investigator, and C. L. Drake and H. A. Gibbon as program planners.