974 results for "Statistical Information on Recidivism (SIR)"


Relevance: 100.00%

Abstract:

A wide range of models used in agriculture, ecology, carbon cycling, climate and other related studies require information on the amount of leaf material present in a given environment to correctly represent radiation, heat, momentum, water, and various gas exchanges with the overlying atmosphere or the underlying soil. Leaf area index (LAI) therefore features as a critical land surface variable in parameterisations of global and regional climate models, since processes such as radiation uptake, precipitation interception, energy conversion, gas exchange and momentum transfer are all substantially determined by the vegetation surface. The optical wavelengths are the electromagnetic regions most commonly used in remote sensing for LAI estimation and for vegetation studies in general. The main purpose of this dissertation was to enhance the determination of LAI from optical observations at three scales: close-range remote sensing (hemispherical photography), airborne remote sensing (high-resolution colour and colour infrared imagery), and satellite remote sensing (high-resolution SPOT 5 HRG imagery). Commonly used light extinction models were applied at all three levels of observation. For comparative analysis, LAI was also determined using statistical relationships between spectral vegetation indices (SVI) and ground-based LAI. The study focuses on two regions: one in the Taita Hills, south-east Kenya, characterised by tropical cloud forest and exotic plantations, and the other in Gatineau Park, southern Quebec, Canada, dominated by temperate hardwood forest. The procedure for sampling the sky map of gap fraction and gap size from hemispherical photographs proved to be one of the most crucial steps in the accurate determination of LAI: LAI and clumping index estimates were significantly affected by the size of the sky segments used within given zenith angle ranges.
On sloping ground, gap fraction and gap size distributions show strong upslope/downslope asymmetry of foliage elements; corrections and sensitivity analyses for both LAI and clumping index computations were therefore demonstrated. Several SVIs can be used for LAI mapping via empirical regression analysis, provided their sensitivity across the relevant range of LAI is large enough. Large-scale LAI inversion algorithms were demonstrated and shown to be an efficient alternative approach for LAI mapping: LAI can be estimated nonparametrically from the information contained solely in the remotely sensed dataset, given that the upper-end (saturated SVI) value is accurately determined. However, further study is still required to devise a methodology, as well as instrumentation, to retrieve on-ground green leaf area index; only then can the large-scale LAI inversion algorithms presented in this work be precisely validated. Finally, based on the literature review and this dissertation, potential future research directions were recommended.
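The light extinction models mentioned above invert the measured gap fraction into LAI via a Beer-Lambert relation. A minimal sketch of that inversion, assuming randomly distributed foliage and the common simplification that the projection function G is about 0.5 near a 57.5° view zenith angle (function name and defaults here are illustrative, not the dissertation's implementation):

```python
import math

def lai_from_gap_fraction(gap_fraction, zenith_deg=57.5, G=0.5, clumping=1.0):
    """Invert the Beer-Lambert gap-fraction model for effective LAI.

    Assumes P(theta) = exp(-G * clumping * LAI / cos(theta)), so
    LAI = -cos(theta) * ln(P(theta)) / (G * clumping).
    Near a 57.5 degree zenith angle, G is close to 0.5 for most leaf
    angle distributions, a common simplification in the literature.
    """
    theta = math.radians(zenith_deg)
    return -math.cos(theta) * math.log(gap_fraction) / (G * clumping)

# example: a 20% gap fraction observed at 57.5 degrees zenith
lai = lai_from_gap_fraction(0.2)
```

Dividing further by the clumping index (values below 1 for grouped foliage) converts the effective LAI into true LAI, which is why the sky-segment sampling that drives the clumping estimate matters so much.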

Abstract:

With few exceptions, the bulk of the collection pertains to the work of the Agro-Joint. Records of the Agro-Joint Director General. Agreements of the American Relief Administration (ARA) and the Joint Distribution Committee with the Soviet government, 1922-1923. Agreements between the Agro-Joint and the Soviet government, 1924, 1927, 1928. Agreements of the Agro-Joint and the American Society for Jewish Farm Settlements (ASJFS) with the Soviet government, 1929, 1930, 1933, 1938. Materials relating to relief work of the JDC within the framework of the American Relief Administration, 1922, including the appointment of J. Rosen as the JDC representative at the ARA. Statistics, reports, miscellaneous correspondence relating to JDC activities in Russia. Minutes, memos, reports, legal documents, certificate of incorporation, and general correspondence relating to the ASJFS, its formation, fund-raising activities, 1927-1939. Records of the Agro-Joint Main Office, Moscow. Annual and periodic reports of the Agro-Joint including statistics, financial estimates, financial reports, analyses of expenditures, relating to Agro-Joint work, 1924-1937. General correspondence files: incoming and outgoing letters, reports, and memoranda. Materials relating to land surveys and allocations in the Crimea: statistics, surveys, memos, correspondence, relating to the Salsk district, Chernomor district, Changar peninsula, Azov, Kuban, Odessa district, Samara district, Povolzhe, Krivoy Rog, Kherson, the Far East, Siberia. Materials relating to contacts with KOMZET: correspondence, minutes of KOMZET meetings, statistical information, reports. By-laws of the OZET (Obshchestvo po Zemleustroystvu Trudyachtchikhsya Evreev - Association for the Settlement of Toiling Jews on Land) and AGRO-KUSTBANK (Evreysky Agrarno-Kustarny Bank - Jewish Agricultural and House Workers Bank). Register of Agro-Joint assets transferred to KOMZET. Records of the Agro-Joint Agricultural Department. Materials

Abstract:

Collection consists of several versions of the constitution; minute books of the membership meetings (1852-1856, 1868-1907, 1914-1971; until 1907 in German, afterwards in English); minute books of meetings of the trustees (1852-1858, 1876-1974, until 1912 in German); an index to and summary of the trustees' minutes (1927-1944); several anniversary journals starting with the 50th, which was also "the first extant history of the Noah Benevolent Society"; membership books (1861-1892, 1930-1965, until 1892 in German; the books after 1930 contain detailed information concerning each member's age, occupation, family, military service, etc.); financial records (1862-1870, 1964-1967, 1972); quarterly accountant's reports (bound with the membership minutes); monthly financial and statistical reports of the Mordechai Federal Credit Union (March 1959-June 1960) established by the Society; lists and addresses of members; newsletters (1927-1979) and other material and photographs reflecting the Society's activities.

Abstract:

Metabolism is the cellular subsystem responsible for generation of energy from nutrients and production of building blocks for larger macromolecules. Computational and statistical modeling of metabolism is vital to many disciplines including bioengineering, the study of diseases, drug target identification, and understanding the evolution of metabolism. In this thesis, we propose efficient computational methods for metabolic modeling. The techniques presented are targeted particularly at the analysis of large metabolic models encompassing the whole metabolism of one or several organisms. We concentrate on three major themes of metabolic modeling: metabolic pathway analysis, metabolic reconstruction and the study of evolution of metabolism. In the first part of this thesis, we study metabolic pathway analysis. We propose a novel modeling framework called gapless modeling to study biochemically viable metabolic networks and pathways. In addition, we investigate the utilization of atom-level information on metabolism to improve the quality of pathway analyses. We describe efficient algorithms for discovering both gapless and atom-level metabolic pathways, and conduct experiments with large-scale metabolic networks. The presented gapless approach offers a compromise in terms of complexity and feasibility between the previous graph-theoretic and stoichiometric approaches to metabolic modeling. Gapless pathway analysis shows that microbial metabolic networks are not as robust to random damage as suggested by previous studies. Furthermore, the amino acid biosynthesis pathways of the fungal species Trichoderma reesei discovered from atom-level data are shown to closely correspond to those of Saccharomyces cerevisiae. In the second part, we propose computational methods for metabolic reconstruction in the gapless modeling framework. We study the task of reconstructing a metabolic network that does not suffer from connectivity problems.
Such problems often limit the usability of reconstructed models, and typically require a significant amount of manual postprocessing. We formulate gapless metabolic reconstruction as an optimization problem and propose an efficient divide-and-conquer strategy to solve it for real-world instances. We also describe computational techniques for solving problems stemming from ambiguities in metabolite naming. These techniques have been implemented in ReMatch, a web-based software intended for the reconstruction of models for 13C metabolic flux analysis. In the third part, we extend our scope from single to multiple metabolic networks and propose an algorithm for inferring gapless metabolic networks of ancestral species from phylogenetic data. Experimenting with 16 fungal species, we show that the method is able to generate results that are easily interpretable and that provide hypotheses about the evolution of metabolism.
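The connectivity ("gap") problems discussed above can be illustrated with a toy producibility check: a metabolite is a gap if no chain of reactions can produce it from the nutrient inputs. This sketch only illustrates the concept, not the optimization-based reconstruction method of the thesis (all names and the toy network are hypothetical):

```python
def producible_metabolites(reactions, seeds):
    """Forward-propagate producibility: a reaction can fire once all of
    its substrates are available, and its products then become available.
    `reactions` maps a reaction name to a (substrates, products) pair.
    Returns the set of metabolites reachable from the seed nutrients."""
    available = set(seeds)
    changed = True
    while changed:
        changed = False
        for substrates, products in reactions.values():
            if set(substrates) <= available and not set(products) <= available:
                available |= set(products)
                changed = True
    return available

# toy network: C is unreachable because nothing produces its substrate X
toy = {
    "r1": (["A"], ["B"]),
    "r2": (["B"], ["D"]),
    "r3": (["X"], ["C"]),
}
reachable = producible_metabolites(toy, seeds={"A"})
gaps = {"C", "X"} - reachable
```

A gapless reconstruction would add or re-annotate reactions until every metabolite required by the model is producible from the growth medium.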

Abstract:

Analyzing statistical dependencies is a fundamental problem in all empirical science. Dependencies help us understand causes and effects, create new scientific theories, and devise cures for problems. Nowadays, large amounts of data are available, but efficient computational tools for analyzing the data are missing. In this research, we develop efficient algorithms for a commonly occurring search problem - searching for the statistically most significant dependency rules in binary data. We consider dependency rules of the form X->A or X->not A, where X is a set of positive-valued attributes and A is a single attribute. Such rules describe which factors either increase or decrease the probability of the consequent A. Classical examples are genetic and environmental factors, which can either cause or prevent a disease. The emphasis in this research is that the discovered dependencies should be genuine - i.e. they should also hold in future data. This is an important distinction from traditional association rules, which - in spite of their name and a similar appearance to dependency rules - do not necessarily represent statistical dependencies at all, or represent only spurious connections that occur by chance. Therefore, the principal objective is to search for rules with statistical significance measures. Another important objective is to search only for non-redundant rules, which express the real causes of the dependence without any incidental extra factors. Such extra factors add no new information on the dependence, but can only blur it and make it less accurate in future data. The problem is computationally very demanding, because the number of possible rules increases exponentially with the number of attributes. In addition, neither statistical dependency nor statistical significance is a monotonic property, which means that traditional pruning techniques do not work.
As a solution, we first derive the mathematical basis for pruning the search space with any well-behaving statistical significance measure. The mathematical theory is complemented by a new algorithmic invention that enables an efficient search without any heuristic restrictions. The resulting algorithm can be used to search for both positive and negative dependencies with any commonly used statistical measure, such as Fisher's exact test, the chi-squared measure, mutual information, and z scores. According to our experiments, the algorithm scales well, especially with Fisher's exact test, and can easily handle even the densest data sets with 10,000-20,000 attributes. Still, the results are globally optimal, which is a remarkable improvement over existing solutions. In practice, this means that the user does not have to worry whether the dependencies hold in future data or whether the data still contains better, but undiscovered, dependencies.
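To illustrate the kind of significance measure involved, the dependency of a rule X->A can be scored with a one-sided Fisher's exact test on the 2x2 contingency table of X against A. The sketch below is a textbook implementation of that test, not the pruning algorithm described in the abstract (function names are illustrative):

```python
from math import comb

def fisher_one_sided(a, b, c, d):
    """One-sided Fisher's exact test p-value for the 2x2 table
    [[a, b], [c, d]]: the probability of observing a co-occurrence
    count of at least `a` under the hypergeometric null of independence."""
    n = a + b + c + d
    row1, col1 = a + b, a + c
    p = 0.0
    for k in range(a, min(row1, col1) + 1):
        p += comb(row1, k) * comb(n - row1, col1 - k) / comb(n, col1)
    return p

def rule_significance(n_xa, n_x, n_a, n):
    """p-value for the rule X -> A, given n_xa rows containing both X
    and A, n_x rows with X, n_a rows with A, and n rows in total."""
    a = n_xa
    b = n_x - n_xa          # X without A
    c = n_a - n_xa          # A without X
    d = n - n_x - n_a + n_xa  # neither
    return fisher_one_sided(a, b, c, d)
```

A smaller p-value indicates a stronger (less likely spurious) dependency; a rule search would rank candidate rules by this value while pruning the exponential search space.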

Abstract:

The development of fishery indicators is a crucial undertaking, as it ultimately provides stakeholders with evidence about the status of fished species, such as population size and survival rates. In Queensland, as in many other parts of the world, age-abundance indicators (e.g. fish catch rate and/or age composition data) are traditionally used as the evidence base because they provide information on species' life history traits as well as on changes in fishing pressure and population size. Often, however, the accuracy of the information from age-abundance indicators is limited by missing or biased data. Consequently, improved statistical methods are required to enhance the accuracy, precision and decision-support value of age-abundance indicators.

Abstract:

The National Energy Efficient Building Project (NEEBP) Phase One report, published in December 2014, investigated "process issues and systemic failures" in the administration of the energy performance requirements in the National Construction Code (NCC). It found that most stakeholders believed that under-compliance with these requirements is widespread across Australia, with similar issues reported in all states and territories. The report found that many different factors contribute to this outcome and, as a result, offered many recommendations that together would be expected to remedy the systemic issues reported. To follow up on the Phase One report, three additional projects were commissioned as part of Phase Two of the overall NEEBP project. This report deals with the development and piloting of an Electronic Building Passport (EBP) tool – a project undertaken jointly by pitt&sherry and a team at the Queensland University of Technology (QUT) led by Dr Wendy Miller. The other Phase Two projects cover audits of Class 1 buildings and issues relating to building alterations and additions. The passport concept aims to provide all stakeholders with (controlled) access to the key documentation and information they need to verify the energy performance of buildings. This trial project deals with residential buildings, but in principle the concept could apply to any building type. Nine councils were recruited to help develop and test a pilot electronic building passport tool. The participation of these councils – across all states – enabled an assessment of the extent to which councils currently use documentation to track the compliance of residential buildings with the energy performance requirements in the NCC. Overall we found that none of the participating councils are currently compiling all of the energy performance-related documentation that would demonstrate code compliance.
The key reasons for this include: a major lack of clarity on precisely what documentation should be collected; cost and budget pressures; low public/stakeholder demand for the documentation; and a pragmatic judgement that non-compliance with any regulated documentation requirements represents a relatively low risk for them. Some councils reported producing documentation, such as certificates of final completion, only on demand. Only three of the nine participating councils reported regularly conducting compliance assessments or audits utilising this documentation and/or inspections. Overall, we formed the view that the documentation and information tracking processes operating within the building standards and compliance system are not working to assure compliance with the Code’s energy performance requirements. In other words, the Code, and its implementation under state and territory regulatory processes, is falling short as a ‘quality assurance’ system for consumers. As a result, it is likely that new housing stock is under-performing relative to policy expectations: consuming unnecessary amounts of energy, imposing unnecessarily high energy bills on occupants, and generating unnecessary greenhouse gas emissions. At the same time, councils noted that demand for documentation relating to building energy performance was low. All the participating councils in the EBP pilot agreed that documentation and information processes need to work more effectively if the potential regulatory and market drivers towards energy-efficient homes are to be harnessed. These findings are fully consistent with the Phase One NEEBP report. It was also agreed that an EBP system could potentially play an important role in improving documentation and information processes. However, only one of the participating councils indicated that they might adopt such a system on a voluntary basis.
The majority felt that such a system would only be taken up if it were:

- a nationally agreed system, imposed as a mandatory requirement under state or national regulation;
- capable of being used by multiple parties, including councils, private certifiers, building regulators, builders and energy assessors in particular; and
- fully integrated into their existing document management systems, or at least seamlessly compatible rather than a separate, unlinked tool.

Further, we note that the value of an EBP in capturing statistical information relating to the energy performance of buildings would be much greater if it were adopted on a nationally consistent basis. Councils were clear that a key impediment to the take-up of an EBP system is that they face very considerable budget and staffing challenges; they report that they are often unable to meet all community demands from the resources available to them, and are therefore unlikely to provide resources to support the roll-out of an EBP system on a voluntary basis. Overall, we conclude from this pilot that the public good would be well served if the Australian, state and territory governments continued to develop and implement an Electronic Building Passport system in a cost-efficient and effective manner. This development should occur with detailed input from building regulators, the Australian Building Codes Board (ABCB), councils and private certifiers in the first instance. This report provides a suite of recommendations (Section 7.2) designed to advance the development and guide the implementation of a national EBP system.

Abstract:

The use of social media has spread into many different areas, including marketing, customer service, and corporate disclosure. However, our understanding of the timely effect of financial reporting information on Twitter is still limited. In this paper, we propose to examine the timely effect of financial reporting information on Twitter in the Australian context, as reflected in stock market trading. We aim to find out whether the level of information asymmetry within the stock market is reduced after the introduction of Twitter and its use for financial reporting purposes.

Abstract:

The use of social media has spread into many different areas, including marketing, customer service, and corporate disclosure. However, our understanding of the timely effect of financial reporting information on Twitter is still limited. In this paper, we examine the timely effect of financial reporting information on Twitter in the Australian context, as reflected in the follow-up stock market reaction. Using event study methodology and a comparative setting, we find that financial reporting disclosure on Twitter reduces the level of information asymmetry, as evidenced by a reduction of the bid-ask spread and an increase in share trading volume. The results of this study imply that financial reporting disclosure on social media assists the dissemination of information and the stock market's response to this information.
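As an illustration of the spread-based measure of information asymmetry, the relative bid-ask spread can be compared between pre- and post-event windows; a decrease is consistent with reduced asymmetry. A minimal sketch of that comparison (not the paper's actual event-study design; names and data are hypothetical):

```python
def relative_spread(bid, ask):
    """Quoted relative bid-ask spread: (ask - bid) / midpoint."""
    mid = (bid + ask) / 2.0
    return (ask - bid) / mid

def mean_spread_change(pre_quotes, post_quotes):
    """Mean relative spread in the post-event window minus the pre-event
    window; a negative value is consistent with reduced information
    asymmetry. Each argument is a list of (bid, ask) pairs."""
    pre = sum(relative_spread(b, a) for b, a in pre_quotes) / len(pre_quotes)
    post = sum(relative_spread(b, a) for b, a in post_quotes) / len(post_quotes)
    return post - pre

# toy quotes: spread narrows from 2% to 1% after the disclosure event
change = mean_spread_change([(99.0, 101.0)], [(99.5, 100.5)])
```

A full event study would average such changes across firms and test them against a matched control sample, alongside the trading-volume comparison mentioned in the abstract.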

Abstract:

Early detection of (pre-)signs of ulceration on a diabetic foot is valuable for clinical practice. Hyperspectral imaging is a promising technique for the detection and classification of such (pre-)signs. However, the number of spectral bands should be limited to avoid overfitting, which is critical for pixel classification with hyperspectral image data. The goal was to design a detector/classifier based on spectral imaging (SI) with a small number of optical bandpass filters; the performance and stability of the design were also investigated. The selection of the bandpass filters boils down to a feature selection problem. A dataset was built containing reflectance spectra of 227 skin spots from 64 patients, measured with a spectrometer. Each skin spot was annotated manually by clinicians as "healthy" or as a specific (pre-)sign of ulceration. Statistical analysis on the dataset showed that the number of required filters is between 3 and 7, depending on additional constraints on the filter set. The stability analysis revealed that shot noise was the most critical factor affecting the classification performance, and indicated that this impact could be avoided in future SI systems with a camera sensor whose saturation level is higher than 10^6, or by image postprocessing.
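Since filter selection is framed as a feature selection problem, a generic greedy forward-selection loop conveys the idea: spectral bands are added one at a time while a score keeps improving. The scoring function below is a toy stand-in, not the statistical analysis used in the study (all names are illustrative):

```python
def greedy_forward_selection(n_features, score, max_features):
    """Greedy forward selection: repeatedly add the feature (spectral
    band) that most improves score(subset); stop at max_features or
    when no remaining candidate improves the score."""
    selected = []
    best = float("-inf")
    while len(selected) < max_features:
        candidate, cand_score = None, best
        for f in range(n_features):
            if f in selected:
                continue
            s = score(selected + [f])
            if s > cand_score:
                candidate, cand_score = f, s
        if candidate is None:  # no improvement possible
            break
        selected.append(candidate)
        best = cand_score
    return selected

# toy objective: reward covering bands {1, 3, 5}, small penalty per filter
def toy_score(subset):
    return len(set(subset) & {1, 3, 5}) - 0.01 * len(subset)

picked = greedy_forward_selection(8, toy_score, max_features=7)
```

In practice the score would be cross-validated classification accuracy of the pixel classifier on the annotated spectra, which is what keeps the filter count small enough to avoid overfitting.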

Abstract:

The information that economic agents have and regard as relevant to their decision making is often assumed to be exogenous in economics. It is assumed that the agents either possess the payoff-relevant information or can observe it without having to exert any effort to acquire it. In this thesis we relax the assumption of an ex-ante fixed information structure and study what happens to equilibrium behavior when agents must also decide what information to acquire and when to acquire it. The thesis addresses this question in two essays on herding and two essays on auction theory. In the first two essays, which are joint work with Klaus Kultti, we study herding models where it is costly to acquire information on the actions that preceding agents have taken. In our model the agents have to decide both the action that they take and, additionally, the information that they want to acquire by observing their predecessors. We characterize the equilibrium behavior when the decision to observe preceding agents' actions is endogenous, and show how the equilibrium outcome may differ from the standard model, where all preceding agents' actions are assumed to be observable. In the latter part of the thesis we study two dynamic auctions: the English and the Dutch auction. We consider a situation where bidders are uninformed about their valuations for the object that is put up for sale and may acquire this information for a small cost at any point during the auction. We study the case of independent private valuations. In the third essay we characterize the equilibrium behavior in an English auction when there are informed and uninformed bidders. We show that the informed bidder may jump bid and signal to the uninformed that he has a high valuation, thus deterring the uninformed from acquiring information and staying in the auction.
The uninformed bidder optimally acquires information once the price has passed a particular threshold and the informed bidder has not signalled that his valuation is high. In addition, we provide an example of an information structure where the informed bidder initially waits and then makes multiple jumps. In the fourth essay we study the Dutch auction. We consider two cases where all bidders are initially uninformed. In the first case the information acquisition cost is the same across all bidders; in the second, the cost of information acquisition is also independently distributed and private information to the bidders. We characterize a mixed-strategy equilibrium in the first case and a pure-strategy equilibrium in the second. In addition, we provide a conjecture about an equilibrium in an asymmetric situation with one informed and one uninformed bidder. We compare the revenues that the first price auction and the Dutch auction generate, and find that under some circumstances the Dutch auction outperforms the first price sealed bid auction. The usual first price sealed bid auction and the Dutch auction are strategically equivalent; however, this equivalence breaks down when information can be acquired during the auction.

Abstract:

Transposons are mobile elements of genetic material that are able to move in the genomes of their host organisms using a special form of recombination called transposition. Bacteriophage Mu was the first transposon for which a cell-free in vitro transposition reaction was developed. Subsequently, the reaction has been refined, and the minimal Mu in vitro reaction is useful for generating comprehensive libraries of mutant DNA molecules that can be used in a variety of applications. To date, the functional genetics applications of Mu in vitro technology have been limited to plasmids or to genomic regions and entire virus genomes cloned into specific vectors. This study expands the use of Mu in vitro transposition in functional genetics and genomics by describing novel methods applicable to the targeted transgenesis of the mouse and to the whole-genome analysis of bacteriophages. The methods described here are rapid, efficient, and easily applicable to a wide variety of organisms, demonstrating the potential of Mu transposition technology in the functional analysis of genes and genomes. First, an easy-to-use, rapid strategy to generate constructs for the targeted mutagenesis of mouse genes was developed. To test the strategy, a gene encoding a neuronal K+/Cl- cotransporter was mutagenised. After a highly efficient transpositional mutagenesis, the mutagenised gene fragments were cloned into a vector backbone and transferred into bacterial cells. These constructs were screened by PCR using an effective 3D matrix system. In addition to traditional knock-out constructs, the method yields hypomorphic alleles that lead to reduced expression of the target gene in transgenic mice and have since been used in a follow-up study. Moreover, a scheme is devised to rapidly produce conditional alleles from the constructs produced.
Next, an efficient strategy for the whole-genome analysis of bacteriophages was developed, based on the transpositional mutagenesis of uncloned, infective virus genomes and their subsequent transfer into susceptible host cells. Mutant viruses able to produce viable progeny were collected and their transposon integration sites determined to map genomic regions nonessential to the viral life cycle. This method, applied here to three very different bacteriophages, PRD1, ΦYeO3-12, and PM2, does not require the target genome to be cloned and is directly applicable to all DNA and RNA viruses that have infective genomes. The method yielded valuable novel information on the three bacteriophages studied, and the whole-genome data can be complemented with concomitant studies on individual genes. Moreover, the end-modified transposons constructed for this study can be used to manipulate genomes devoid of suitable restriction sites.

Abstract:

Cancer is a leading cause of death worldwide and the total number of cancer cases continues to increase. Many cancers, for example sinonasal cancer and lung cancer, have clear external risk factors and so are potentially preventable. The occurrence of sinonasal cancer is strongly associated with wood dust exposure and the main risk factor for lung cancer is tobacco smoking. Although the molecular mechanisms involved in lung carcinogenesis have been widely studied, very little is known about the molecular changes leading to sinonasal cancer. In this work, mutations in the tumour suppressor TP53 gene in cases of sinonasal cancer and lung cancer and the associations of these mutations with exposure factors were studied. In addition, another important mechanism in many cancers, inflammation, was explored by analyzing the expression of the inflammation related enzyme, COX-2, in sinonasal cancer. The results demonstrate that TP53 mutations are frequent in sinonasal cancer and lung cancer and in both cancers they are associated with exposure. In sinonasal cancer, the occurrence of TP53 mutation significantly increased in relation to long duration and high level of exposure to wood dust. Smoking was not associated with the overall occurrence of the TP53 mutation in sinonasal cancer, but was associated with multiple TP53 mutations. Furthermore, inflammation appears to play a part in sinonasal carcinogenesis as indicated by our results showing that the expression of COX-2 was associated with adenocarcinoma type of tumours, wood dust exposure and non-smoking. In lung cancer, we detected statistically significant associations between TP53 mutations and duration of smoking, gender and histology. We also found that patients with a tumour carrying a G to T transversion, a mutation commonly found in association with tobacco smoking, had a high level of smoking-related bulky DNA adducts in their non-tumorous lung tissue. 
Altogether, the information on molecular changes in exposure-induced cancers adds to the observations from epidemiological studies and helps in understanding the role and impact of different etiological factors, which in turn can be beneficial for risk assessment and prevention.

Abstract:

The information on the altitude distribution of aerosols in the atmosphere is essential in assessing the impact of aerosol warming on the thermal structure and stability of the atmosphere. In addition, the aerosol altitude distribution is needed to address complex problems such as the radiative interaction of aerosols in the presence of clouds. With this objective, an extensive, multi-institutional and multi-platform field experiment (ICARB - Integrated Campaign for Aerosols, gases and Radiation Budget) was carried out under the Geosphere Biosphere Programme of the Indian Space Research Organization (ISRO-GBP) over continental India and the adjoining oceans during March to May 2006. Here, we present airborne lidar measurements carried out over the east coast of India during the ICARB field campaign. An increase in aerosol extinction (scattering + absorption) was observed from the surface upwards, with a maximum around 2 to 4 km. Aerosol extinction at higher atmospheric layers (>2 km) was two to three times larger than that at the surface. A large fraction (75-85%) of the aerosol column optical depth was contributed by aerosols located above 1 km. The aerosol layer heights (defined in this paper as the height at which the gradient in the extinction coefficient changes sign) showed a gradual decrease with increasing offshore distance. A large fraction (60-75%) of aerosol was found to be located above clouds, indicating enhanced aerosol absorption above clouds. Our study implies that a detailed statistical evaluation of the temporal frequency and spatial extent of elevated aerosol layers is necessary to assess their significance for the climate. This is feasible using data from space-borne lidars such as CALIPSO, which flies in formation with other satellites, such as MODIS Aqua and MISR, as part of the A-Train constellation.
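The layer-height definition used above (the height at which the gradient of the extinction coefficient changes sign) can be applied to a discrete profile with a simple scan. A minimal sketch under the assumption of a single elevated layer (function name and the toy profile are illustrative):

```python
def layer_height(heights, extinction):
    """Return the first height at which the vertical gradient of the
    extinction coefficient changes sign from positive to negative,
    i.e. the top of the region where extinction increases with
    altitude. Returns None for monotonic profiles."""
    for i in range(1, len(extinction) - 1):
        d_lower = extinction[i] - extinction[i - 1]
        d_upper = extinction[i + 1] - extinction[i]
        if d_lower > 0 and d_upper < 0:
            return heights[i]
    return None

# toy extinction profile (per km) peaking at 3 km altitude,
# mimicking the elevated aerosol layers reported in the campaign
h = [0.5, 1.0, 2.0, 3.0, 4.0, 5.0]
ext = [0.05, 0.08, 0.12, 0.15, 0.10, 0.06]
peak = layer_height(h, ext)
```

Real lidar profiles are noisy, so a smoothing step (or a gradient threshold) would typically precede the sign-change scan.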

Abstract:

PURPOSE: To study the utility of fractional calculus in modeling gradient-recalled echo MRI signal decay in the normal human brain. METHODS: We solved analytically the extended time-fractional Bloch equations, resulting in five model parameters, namely the amplitude, relaxation rate, order of the time-fractional derivative, frequency shift, and constant offset. Voxel-level temporal fitting of the MRI signal was performed using the classical monoexponential model, a previously developed anomalous relaxation model, and our extended time-fractional relaxation model. Nine brain regions segmented from multiple-echo gradient-recalled echo 7 Tesla MRI data acquired from five participants were then used to investigate the characteristics of the extended time-fractional model parameters. RESULTS: We found that the extended time-fractional model is able to fit the experimental data with smaller mean squared error than the classical monoexponential relaxation model and the anomalous relaxation model, neither of which accounts for frequency shift. CONCLUSIONS: We were able to fit multiple-echo-time MRI data with high accuracy using the developed model. Parameters of the model likely capture information on microstructural and susceptibility-induced changes in the human brain.
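For context, the classical monoexponential baseline named in METHODS, S(TE) = A·exp(-R2*·TE), can be fitted voxel-wise by log-linear regression when the constant offset is negligible. This sketch covers only that baseline, not the extended time-fractional model (which additionally estimates the fractional order, frequency shift, and offset); names and the synthetic decay are illustrative:

```python
import math

def fit_monoexponential(echo_times, signal):
    """Fit S(TE) = A * exp(-R2s * TE) by linear regression on log(S).

    A simplified baseline with the constant offset assumed zero; taking
    logs turns the decay into a straight line with slope -R2s and
    intercept log(A). Returns the pair (A, R2s)."""
    xs = echo_times
    ys = [math.log(s) for s in signal]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return math.exp(my - slope * mx), -slope

# synthetic multi-echo decay: A = 1000, R2* = 50 1/s, TEs in seconds
tes = [0.005, 0.010, 0.015, 0.020, 0.025]
sig = [1000.0 * math.exp(-50.0 * te) for te in tes]
A, r2s = fit_monoexponential(tes, sig)
```

The richer models compared in the abstract are nonlinear in their parameters and would instead require iterative least-squares fitting at each voxel.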