853 results for Universities and colleges -- Australia -- entrance requirements -- Data processing
Abstract:
The MAREDAT atlas covers 11 types of plankton, ranging in size from bacteria to jellyfish. Together, these plankton groups determine the health and productivity of the global ocean and play a vital role in the global carbon cycle. Working within a uniform and consistent spatial and depth grid (map) of the global ocean, the researchers compiled thousands to tens of thousands of data points to identify regions of plankton abundance and scarcity as well as areas of data abundance and scarcity. At many of the grid points, the MAREDAT team accomplished the difficult conversion from abundance (numbers of organisms) to biomass (carbon mass of organisms). The MAREDAT atlas provides an unprecedented global data set for ecological and biogeochemical analysis and modeling, as well as a clear mandate for compiling additional existing data and for focusing future data-gathering efforts on key groups in key areas of the ocean. This is a gridded data product about diazotrophic organisms. There are 6 variables. Each variable is gridded on a dimension of 360 (longitude) × 180 (latitude) × 33 (depth) × 12 (month). The first group of 3 variables comprises: (1) number of biomass observations, (2) biomass, and (3) special nifH-gene-based biomass. The second group of 3 variables is the same as the first except that it grids only non-zero data. We have constructed a database on diazotrophic organisms in the global pelagic upper ocean by compiling more than 11,000 direct field measurements, including 3 sub-databases: (1) nitrogen fixation rates, (2) cyanobacterial diazotroph abundances from cell counts and (3) cyanobacterial diazotroph abundances from qPCR assays targeting nifH genes. Biomass conversion factors are estimated from cell sizes to convert abundance data to diazotrophic biomass. Data are assigned to 3 groups: Trichodesmium, unicellular diazotrophic cyanobacteria (groups A, B and C where applicable) and heterocystous cyanobacteria (Richelia and Calothrix).
Total nitrogen fixation rates and diazotrophic biomass are calculated by summing the values from all the groups. Some of the nitrogen fixation rates are whole-seawater measurements and are used as total nitrogen fixation rates. Both volumetric and depth-integrated values are reported. Depth-integrated values are also calculated for those vertical profiles with values at 3 or more depths.
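The depth-integration step described above can be sketched in a few lines. The following is a minimal illustration with hypothetical toy values and assumed depth levels, not the actual MAREDAT grids:

```python
import numpy as np

# The full product is gridded 360 (lon) x 180 (lat) x 33 (depth) x 12 (month);
# here we build only a small slab of that shape for illustration.
n_depth = 33
biomass = np.full((2, 1, n_depth, 1), np.nan)       # NaN = no observation
biomass[0, 0, :5, 0] = [3.0, 2.5, 2.0, 1.0, 0.5]    # a 5-depth profile
biomass[1, 0, :2, 0] = [1.0, 0.8]                   # only 2 depths: skipped

depth = np.linspace(0.0, 500.0, n_depth)  # assumed depth levels in metres

def depth_integrate(profile, z):
    """Trapezoidal depth integration, only for profiles with >= 3 depths."""
    valid = ~np.isnan(profile)
    if valid.sum() < 3:
        return np.nan
    pv, zv = profile[valid], z[valid]
    return float(np.sum((pv[1:] + pv[:-1]) * np.diff(zv)) / 2.0)

col = depth_integrate(biomass[0, 0, :, 0], depth)
print(round(col, 1))  # depth-integrated biomass (mg C m^-2): 113.3
```

The 3-depth threshold mirrors the rule stated above: profiles with fewer than three sampled depths are left out of the depth-integrated product.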
Abstract:
This paper develops a simple model of the post-secondary education system in Canada that provides a useful basis for thinking about issues of capacity and access. It uses a supply-demand framework, where demand comes from individuals wanting places in the system, and supply is determined not only by various directives and agreements between educational ministries and institutions (and other factors), but also by the money available to universities and colleges through tuition fees. The supply and demand curves are then put together with a stylised tuition-setting rule to describe the “market” of post-secondary schooling. This market determines the number of students in the system, and their characteristics, especially as they relate to “ability” and family background, the latter being especially relevant to access issues. Various changes in the system – including tuition fees, student financial aid, government support for institutions, and the returns to schooling – are then discussed in terms of how they affect the number of students and their characteristics, that is, capacity and access.
Abstract:
When we study the variables that affect survival time, we usually estimate their effects by the Cox regression model. In biomedical research, effects of the covariates are often modified by a biomarker variable. This leads to covariate-biomarker interactions. Here the biomarker is an objective measurement of patient characteristics at baseline. Liu et al. (2015) built a local partial likelihood bootstrap model to estimate and test this interaction effect of covariates and biomarker, but the R code developed by Liu et al. (2015) can only handle one variable and one interaction term, and cannot fit the model with adjustment for nuisance variables. In this project, we expand the model to allow adjustment for nuisance variables, expand the R code to take any chosen interaction terms, and set up many parameters for users to customize their research. We also build an R package called "lplb" to integrate the complex computations into a simple interface. We conduct numerical simulations to show that the new method has excellent finite sample properties under both the null and alternative hypotheses. We also apply the method to analyze data from a prostate cancer clinical trial with the acid phosphatase (AP) biomarker.
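As a rough sketch of the model being estimated (not the lplb package itself), the covariate-biomarker interaction enters the Cox model through a product term in the design matrix. All names and toy data below are hypothetical:

```python
import numpy as np

# Hazard: h0(t) * exp(b1*x + b2*w + b3*x*w), where x is a covariate
# (e.g. treatment) and w the baseline biomarker.
rng = np.random.default_rng(1)
n = 8
x = rng.integers(0, 2, size=n).astype(float)  # treatment indicator (toy)
w = rng.normal(size=n)                        # baseline biomarker (toy)
time = rng.exponential(1.0, size=n)           # observed times (toy)
event = np.ones(n)                            # all events, no censoring

X = np.column_stack([x, w, x * w])            # design matrix with interaction

def cox_partial_loglik(beta, X, time, event):
    """Breslow partial log-likelihood, assuming no tied event times."""
    order = np.argsort(time)
    eta, ev = (X @ beta)[order], event[order]
    # log of the risk-set sums: suffix sums of exp(eta) in time order
    log_risk = np.log(np.cumsum(np.exp(eta)[::-1])[::-1])
    return float(np.sum(ev * (eta - log_risk)))

ll0 = cox_partial_loglik(np.zeros(3), X, time, event)
```

At a given biomarker value w0, the hazard ratio associated with x is exp(b1 + b3*w0); roughly speaking, the local partial likelihood approach estimates this interaction effect as a smooth function of the biomarker rather than through a single global coefficient b3.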
Abstract:
The substantive legislation on which Agricultural Processing Companies (SAT) are based has some notable gaps with regard to the pertinent accounting system. There are grey areas concerning compulsory accounting records and their legalization, together with the process for drawing up, checking, approving and depositing the annual accounts. Consequently, in this paper we will look first at the corporate and accounting records of Agricultural Processing Companies, putting forward proposals in the wake of recent legislation on the legalization of generally applied corporate and accounting documents. A critical analysis will also be made of the entire process of drafting, auditing, approving and depositing the annual accounts and other documents that Agricultural Processing Companies must send each year to their respective regional registries. Legal and mercantile registries will be differentiated from administrative ones and, in this last sense, changes will be suggested with regard to the place and purpose of the deposit of such documents. After thirty-four years, the substantive legislation on economic and accounting matters for the SAT is out of step with current law, so a review is necessary. Recent regional regulations have not been a real breakthrough in this regard. We assert the existence of a gap between the substantive rules of the SAT and general accounting rules on financial statements, which is unsustainable and requires swift legislative action to be closed.
Abstract:
Numerical modelling and simulations are needed to develop and test specific analysis methods by providing test data before BIRDY is launched. This document describes the "satellite data simulator", a multi-sensor, multi-spectral satellite simulator produced especially for the BIRDY mission, which could also be used to analyse data from other satellite missions providing energetic-particle data in the Solar System.
Abstract:
Objective: this study aims to characterize the quality of life of the elderly in the Leiria region, comparing those living at home with those living in institutions. To this end, we propose to characterize the study population socio-demographically; to identify situational factors according to place of residence; to assess levels of dependence, social support and family functioning; to assess quality of life; and to identify the relationship between the various variables and quality of life. Method: a questionnaire was administered to a total of 238 elderly people, 111 living in institutions and 127 living at home. Throughout the data collection process, the ethical requirements that govern our profession were observed. Descriptive and analytical statistical methods were used for data processing. Results: the results obtained allowed a socio-demographic characterization of the elderly of the Leiria region. It was also possible to compare the two study groups; no significant differences were found between them for the biopsychosocial variables. Conclusion: most of the elderly surveyed have quality of life, with those living at home showing a higher quality of life.
Abstract:
Thesis (Ph.D.)--University of Washington, 2016-08
Abstract:
The workshop will invite participants to engage in a discussion of the characteristics of outstanding leadership by taking part in an interactive activity which we have developed and used in different types of schools and colleges in England. The activity uses Q-methodology to develop and refine characteristics of outstanding leaders and outstanding leadership in education from a range of stakeholder perspectives. Q-methodology is a research method which originates from psychology and is used to study people's subjective viewpoints. We are applying the methodology to the study of enacted leadership practice in different educational contexts. Our sample of stakeholders consists of school and college leaders, governors, middle leaders, teachers, teacher educators, researchers and scholars in educational leadership and management research and practice. The range of contexts in which they work represents different age phases of education: primary, secondary and further education colleges, urban and rural schools and colleges, and selective and non-selective schools. In the workshop, participants will be invited to take part in the Q-sort activity we have used in our research, using statements from leadership theory and practice. The Q-sort will be followed by discussion and reflection on the statements in relation to participants’ own experiences of leadership, management and governance in different contexts.
Abstract:
This dissertation contains four essays that share a common purpose: developing new methodologies to exploit the potential of high-frequency data for the measurement, modeling and forecasting of financial asset volatility and correlations. The first two chapters provide useful tools for univariate applications, while the last two chapters develop multivariate methodologies. In chapter 1, we introduce a new class of univariate volatility models named FloGARCH models. FloGARCH models provide a parsimonious joint model for low-frequency returns and realized measures, and are sufficiently flexible to capture long memory as well as asymmetries related to leverage effects. We analyze the performance of the models in a realistic numerical study and on a data set composed of 65 equities. Using more than 10 years of high-frequency transactions, we document significant statistical gains from the FloGARCH models in terms of in-sample fit, out-of-sample fit and forecasting accuracy compared to classical and Realized GARCH models. In chapter 2, using 12 years of high-frequency transactions for 55 U.S. stocks, we argue that combining low-frequency exogenous economic indicators with high-frequency financial data improves the ability of conditionally heteroskedastic models to forecast the volatility of returns, their full multi-step-ahead conditional distribution and the multi-period Value-at-Risk. Using a refined version of the Realized LGARCH model allowing for a time-varying intercept and implemented with realized kernels, we document that nominal corporate profits and term spreads have strong long-run predictive ability and generate accurate risk-measure forecasts over long horizons. The results are based on several loss functions and tests, including the Model Confidence Set. Chapter 3 is joint work with David Veredas.
We study the class of disentangled realized estimators for the integrated covariance matrix of Brownian semimartingales with finite-activity jumps. These estimators separate correlations and volatilities. We analyze different combinations of quantile- and median-based realized volatilities, and four estimators of realized correlations with three synchronization schemes. Their finite sample properties are studied under four data generating processes, with and without microstructure noise, and under synchronous and asynchronous trading. The main finding is that the pre-averaged version of disentangled estimators based on Gaussian ranks (for the correlations) and median deviations (for the volatilities) provides a precise, computationally efficient, and easy alternative for measuring integrated covariances from noisy and asynchronous prices. Along these lines, a minimum variance portfolio application shows the superiority of this disentangled realized estimator in terms of numerous performance metrics. Chapter 4 is co-authored with Niels S. Hansen, Asger Lunde and Kasper V. Olesen, all affiliated with CREATES at Aarhus University. We propose to use the Realized Beta GARCH model to exploit the potential of high-frequency data in commodity markets. The model produces high-quality forecasts of pairwise correlations between commodities, which can be used to construct a composite covariance matrix. We evaluate the quality of this matrix in a portfolio context and compare it to models used in the industry. We demonstrate significant economic gains in a realistic setting including short-selling constraints and transaction costs.
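The minimum variance portfolio application mentioned above reduces, for any covariance estimate, to a closed-form weight vector. A minimal sketch with an assumed 3-asset covariance matrix standing in for the chapter's estimators:

```python
import numpy as np

# Assumed 3-asset covariance estimate S (a stand-in for any integrated
# covariance estimator); the global minimum variance (GMV) weights are
# w = S^{-1} 1 / (1' S^{-1} 1).
S = np.array([[0.04, 0.01, 0.00],
              [0.01, 0.09, 0.02],
              [0.00, 0.02, 0.16]])

ones = np.ones(len(S))
w = np.linalg.solve(S, ones)   # S^{-1} 1 without forming the inverse
w /= w.sum()                   # normalise to full investment

port_var = float(w @ S @ w)    # variance achieved by the GMV portfolio
print(np.round(w, 3))
```

Because the GMV weights minimise variance over all fully invested portfolios, `port_var` can never exceed the variance of, say, the equal-weight portfolio under the same S; this is one way such covariance estimators are compared in practice.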
Abstract:
Developmental gaps between children from different socioeconomic backgrounds emerge early and persist over time. Cognitive skill formation is a cumulative process and, thus, all relevant influences occurring up to the time skill is measured may play a role in shaping these gaps. Linear decompositions based on the Oaxaca-Blinder technique are a fairly common way to estimate the contribution of two or more categories of variables to these differences in cognitive achievement. Two prominent examples of such categories are family and school influences. In this respect, the literature exhibits no consensus on decomposition strategy or the interpretation of its components, as well as a tendency to separate home and school influences by assigning all observed household, family and child characteristics to the first category. This can lead to misleading policy implications and to biases in the estimated contributions of the categories. This analysis attempts to contribute to the literature in two ways. First, it formally explores the potential for biases in the decomposition exercises attempted so far. Second, it offers an alternative decomposition strategy consistent with explicit behavioral assumptions regarding the determination of skill inputs. This prevents arbitrary choices in terms of decomposition technique, components and interpretation, and also makes the analysis less prone to biases.
I illustrate the main points of the analysis empirically using a dataset containing longitudinal information on test scores and on family and school characteristics, to decompose the cognitive skill gap observed at age 8 between urban and rural children in Peru.
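A minimal sketch of the Oaxaca-Blinder machinery discussed above, on hypothetical simulated data rather than the Peruvian dataset:

```python
import numpy as np

# The mean outcome gap between groups A and B is split into an "explained"
# part (differences in characteristics X) and an "unexplained" part
# (differences in coefficients), here using group A's coefficients as the
# reference; the reference choice is exactly the kind of arbitrary decision
# the abstract criticises.
rng = np.random.default_rng(2)

def ols(X, y):
    return np.linalg.lstsq(X, y, rcond=None)[0]

def simulate(n, beta):
    X = np.column_stack([np.ones(n), rng.normal(size=n)])  # intercept + input
    y = X @ beta + 0.1 * rng.normal(size=n)
    return X, y

Xa, ya = simulate(200, np.array([1.0, 2.0]))   # "urban-like" group (toy)
Xb, yb = simulate(200, np.array([0.5, 1.5]))   # "rural-like" group (toy)

ba, bb = ols(Xa, ya), ols(Xb, yb)
gap = ya.mean() - yb.mean()
explained = (Xa.mean(axis=0) - Xb.mean(axis=0)) @ ba   # endowment differences
unexplained = Xb.mean(axis=0) @ (ba - bb)              # coefficient differences
```

With an intercept in both regressions, the identity gap = explained + unexplained holds exactly, since each group mean of y equals its mean fitted value.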
Abstract:
p.33-44
Abstract:
This research aims to study IT Governance in Brazilian federal universities, discussing the relationships between IT Governance (ITG) mechanisms and the perceived development of IT management in those public institutions. Information Technology Governance is not only a vast subject but also has implications across very different operational and knowledge areas; it is relevant to Public Administration, as part of Corporate Governance and of public concern, and involves high investments in financial, structural, material and human resources. Universities are entities of the Indirect Administration and essential actors in the creation and development of knowledge. Their public administrative agents are responsible for managing public resources and for providing internal policies that determine how IT will enable greater alignment with, and achievement of, institutional business goals. We highlight the role of universities, which manage a significant amount of public resources to achieve their institutional purposes. From this perspective, this theoretical and empirical study aims to draw an ITG panorama of the Brazilian universities (67 universities), for the strategic alignment of governance actions and institutional development, focusing on the efficiency of the public service offered by these institutions. Given this delimitation of the research focus, the methodology involves three investigative activities: (1) documental and bibliographical research; (2) a questionnaire, as an exploratory tool and data collection device, to investigate the perception of IT Governance and Management in the IFES, directed at the executives responsible for IT; and (3) a survey of the availability of ITG information on institutional websites.
This project contributes to studies on this subject: it investigates the relations that establish ITG as a business strategy and shows the implementation of IT Governance as a tool to enable Corporate Governance. In this way, it is expected to contribute to the development of Public Administration, following the principle that improvement requires diagnosis, and thus to offer better results to society in this field.
Abstract:
Reports of hydrilla (Hydrilla verticillata) infestation of lakes Bisina and Opeta were communicated verbally by some members of FIRRI who undertook surveys during the LVEMP 1 phase (1997 to 2004) to assess the diversity and stocks of fishes in the Kyoga basin satellite lakes. This issue was taken up by FIRRI and NAARI staff who work on aquatic weed management to ascertain and quantify the presence of H. verticillata and other aquatic weeds, with the sole aim of finding ways and means of controlling one of the world's worst aquatic weeds, H. verticillata. The survey on Lake Opeta indicated that this weed was rare, since only a few small broken pieces were sighted at the lake's outflow through an extensive wetland to Lake Bisina. It was therefore concluded that it was not economically viable to allocate resources for further survey of H. verticillata on Lake Opeta. This finding therefore discredited the previous (informal) reports that H. verticillata was well established on Lake Opeta. It should be noted that the reports came from scientists who were not well versed in the systematics of aquatic plants.