963 results for Avian Survey Techniques
Abstract:
We introduce the Survey for Ionization in Neutral Gas Galaxies (SINGG), a census of star formation in H I selected galaxies. The survey consists of Hα and R-band imaging of a sample of 468 galaxies selected from the H I Parkes All Sky Survey (HIPASS). The sample spans three decades in H I mass and is free of many of the biases that affect other star-forming galaxy samples. We present the criteria for sample selection, list the entire sample, discuss our observational techniques, and describe the data reduction and calibration methods. This paper focuses on 93 SINGG targets whose observations have been fully reduced and analyzed to date. The majority of these show a single emission line galaxy (ELG). We see multiple ELGs in 13 fields, with up to four ELGs in a single field. All of the targets in this sample are detected in Hα, indicating that dormant (non-star-forming) galaxies with M_HI ≳ 3×10^7 M_⊙ are very rare. A database of the measured global properties of the ELGs is presented. The ELG sample spans 4 orders of magnitude in luminosity (Hα and R band) and Hα surface brightness, nearly 3 orders of magnitude in R surface brightness, and nearly 2 orders of magnitude in Hα equivalent width (EW). The surface brightness distribution of our sample is broader than that of the Sloan Digital Sky Survey (SDSS) spectroscopic sample, the EW distribution is broader than that of prism-selected samples, and the morphologies found include all common types of star-forming galaxies (e.g., irregular, spiral, blue compact dwarf, starburst, merging and colliding systems, and even residual star formation in S0 and Sa spirals). Thus, SINGG presents a superior census of star formation in the local universe, suitable for further studies ranging from the analysis of H II regions to determination of the local cosmic star formation rate density.
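The Hα equivalent width quoted above is, in essence, the ratio of the emission-line flux to the underlying continuum flux density. A minimal sketch of that arithmetic, with made-up flux values rather than SINGG measurements:

```python
def halpha_equivalent_width(line_flux, continuum_flux_density):
    """EW (in Angstroms) = integrated line flux (erg/s/cm^2)
    divided by the continuum flux density (erg/s/cm^2/A)."""
    return line_flux / continuum_flux_density

# Illustrative values only: F(Halpha) = 2e-13 erg/s/cm^2 over a
# continuum of 1e-15 erg/s/cm^2/A gives an EW of 200 A.
ew = halpha_equivalent_width(2e-13, 1e-15)
```

In practice the line flux itself comes from continuum-subtracting the narrowband image with a scaled R-band image, a step omitted here for brevity.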
Abstract:
Many variables that are of interest in social science research are nominal variables with two or more categories, such as employment status, occupation, political preference, or self-reported health status. With longitudinal survey data it is possible to analyse the transitions of individuals between different employment states or occupations (for example). In the statistical literature, models for analysing categorical dependent variables with repeated observations belong to the family of models known as generalized linear mixed models (GLMMs). The specific GLMM for a dependent variable with three or more categories is the multinomial logit random effects model. For these models, the marginal distribution of the response does not have a closed-form solution, and hence numerical integration must be used to obtain maximum likelihood estimates of the model parameters. Techniques for implementing the numerical integration are available but are computationally intensive, requiring an amount of computer processing time that increases with the number of clusters (or individuals) in the data, and they are not always readily accessible to the practitioner in standard software. For the purposes of analysing categorical response data from a longitudinal social survey, there is clearly a need to evaluate the existing procedures for estimating multinomial logit random effects models in terms of accuracy, efficiency, and computing time. Computing time has significant implications for which approach researchers will prefer. In this paper we evaluate statistical software procedures that utilise adaptive Gaussian quadrature and MCMC methods, with specific application to modelling the employment status of women, using a GLMM, over three waves of the HILDA survey.
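The numerical integration the abstract refers to can be illustrated with ordinary (non-adaptive) Gauss-Hermite quadrature, and with a binary rather than multinomial logit to keep the sketch short. The function below is an illustrative assumption, not the software procedure the paper evaluates:

```python
import numpy as np
from numpy.polynomial.hermite import hermgauss

def marginal_loglik(y, X, beta, sigma, n_nodes=20):
    """Marginal log-likelihood of a random-intercept logit model.
    The random effect is integrated out with Gauss-Hermite quadrature
    (nodes/weights for the weight function exp(-x^2)).
    y: (clusters, obs) binary outcomes; X: (clusters, obs, p) covariates."""
    nodes, weights = hermgauss(n_nodes)
    b = np.sqrt(2.0) * sigma * nodes      # change of variable b = sqrt(2)*sigma*x
    total = 0.0
    for yi, Xi in zip(y, X):
        eta = Xi @ beta                   # fixed-effect linear predictor
        p = 1.0 / (1.0 + np.exp(-(eta[:, None] + b[None, :])))
        # likelihood of this cluster's observed sequence at each node
        lik = np.prod(np.where(yi[:, None] == 1, p, 1.0 - p), axis=0)
        total += np.log(np.sum(weights * lik) / np.sqrt(np.pi))
    return total

# Sanity check: with sigma = 0 this reduces to the ordinary logit likelihood.
y = np.array([[1, 0, 1], [0, 0, 1]])
X = np.ones((2, 3, 1))
ll = marginal_loglik(y, X, beta=np.array([0.0]), sigma=0.0)
```

The per-cluster product inside the integral is what makes the cost grow with the number of clusters; adaptive quadrature recentres the nodes per cluster but has the same structure.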
Abstract:
With the advent of globalisation, companies all around the world must improve their performance in order to survive. Threats come from everywhere, and in different forms, such as low-cost products, high-quality products, new technologies, and new products. Companies in different countries are applying various continuous improvement techniques and quality criteria items to strive for excellence. Continuous improvement techniques enable companies to improve their operations; accordingly, companies use techniques such as TQM, Kaizen, Six Sigma, and Lean Manufacturing, and quality award criteria items such as Customer Focus, Human Resources, Information & Analysis, and Process Management. The purpose of this paper is to compare the use of these techniques and criteria items in two countries, Mexico and the United Kingdom, which differ in culture and industrial structure. In terms of the use of continuous improvement tools and techniques, Mexico formally started to deal with continuous improvement by creating its National Quality Award soon after the Americans established the Malcolm Baldrige National Quality Award. The United Kingdom formally started by using the European Quality Award (EQA), since modified and renamed the EFQM Excellence Model. The methodology used in this study was to undertake a literature review of the subject matter and to study some general applications around the world. A questionnaire survey was then designed and administered in the two countries using the same scale, about the same sample size, and about the same industrial sector. The survey presents a brief definition of each of the constructs to facilitate understanding of the questions. The analysis of the data was then conducted with the assistance of a statistical software package. The survey results indicate both similarities and differences in the strengths and weaknesses of the companies in the two countries. One outcome of the analysis is that it enables the companies to use the results to benchmark themselves and thus act to reinforce their strengths and to reduce their weaknesses.
Abstract:
Numerous techniques have been developed to control the cost and time of construction projects. However, there is limited research on issues surrounding the practical usage of these techniques. To address this, a survey was conducted of the top 150 construction companies and 100 construction consultancies in the UK, aimed at identifying common project control practices and the factors inhibiting effective project control in practice. It found that, despite the widespread application of control techniques, a high proportion of respondents still experienced cost and time overruns on a significant proportion of their projects. Analysis of the survey results concluded that more effort should be directed at managing the identified top project-control-inhibiting factors. This paper outlines some measures for mitigating these inhibiting factors so that the outcome of project time and cost control can be improved in practice.
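The abstract does not name the specific control techniques surveyed. As one hedged illustration only, earned value analysis is a common cost/time control method, and its two headline indices are simple ratios:

```python
def earned_value_indices(bcws, bcwp, acwp):
    """Cost and schedule performance indices from earned value figures.
    bcws: budgeted cost of work scheduled (planned value)
    bcwp: budgeted cost of work performed (earned value)
    acwp: actual cost of work performed."""
    cpi = bcwp / acwp   # > 1 means under budget, < 1 over budget
    spi = bcwp / bcws   # > 1 means ahead of schedule, < 1 behind
    return cpi, spi

# Illustrative figures, not survey data: a project that has earned
# 90k of value against a 100k plan at an actual cost of 120k.
cpi, spi = earned_value_indices(bcws=100_000, bcwp=90_000, acwp=120_000)
```

Here the project is both over budget (CPI = 0.75) and behind schedule (SPI = 0.9), the pattern of overrun the survey respondents report.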
Abstract:
This paper presents the results of a study examining the methods used to select employees in 579 UK organizations representing a range of organization sizes and industry sectors. Overall, a smaller proportion of organizations in this sample reported using formalized methods (e.g., assessment centres) than informal methods (e.g., unstructured interviews). The curriculum vitae (CV) was the most commonly used selection method, followed by the traditional triad of application form, interviews, and references. Findings also indicated that the use of different selection methods was similar in both large organizations and small-to-medium-sized enterprises. Differences were found across industry sectors, with the public and voluntary sectors being more likely to use formalized techniques (e.g., application forms rather than CVs, and structured rather than unstructured interviews). The results are discussed in relation to their implications, both for practice and for future research.
Abstract:
Traditional Chinese Medicine (TCM) has been actively researched through various approaches, including computational techniques. A review of the basic elements of TCM is provided to illuminate the various challenges and progress in its study using computational methods. Information on various TCM formulations, in particular resources on databases of TCM formulations and their integration with Western medicine, is analyzed in several facets, such as TCM classifications, types of databases, and mining tools. Aspects of computational TCM diagnosis, namely inspection, auscultation, pulse analysis, as well as TCM expert systems, are reviewed in terms of their benefits and drawbacks. Various approaches to exploring relationships among TCM components and finding genes/proteins related to TCM symptom complexes are also studied. This survey provides a summary of the advance of computational approaches for TCM and will be useful for future knowledge discovery in this area.
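As a hedged illustration of the "mining tools" facet, co-occurrence (support) counting over a formulation database is about the simplest relationship-mining step. The herb names and data below are placeholders, not drawn from any real TCM database:

```python
from collections import Counter
from itertools import combinations

# Hypothetical toy formulation database (each entry: the herbs in one formula).
formulas = [
    {"ginseng", "licorice", "ginger"},
    {"ginseng", "licorice", "astragalus"},
    {"licorice", "ginger"},
]

# Count how often each unordered pair of herbs appears together.
pair_counts = Counter()
for herbs in formulas:
    pair_counts.update(combinations(sorted(herbs), 2))

# Support of a pair = fraction of formulas in which both herbs occur.
support = {pair: n / len(formulas) for pair, n in pair_counts.items()}
top = max(support, key=support.get)   # a most frequently co-prescribed pair
```

Real mining tools go further (association rules, network analysis), but they are built on exactly these co-occurrence statistics.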
Abstract:
This paper reports the results of a web-based study of perceptions of accounting journals in Australasia. Journal ranking studies have generally adopted citation techniques or used academics' perceptions as the basis for assessing journal quality. Our research contributes to the existing literature by conducting a survey of academics in Australasia using a web-based instrument. The analysis indicates that perceptions of the so-called "elite" accounting journals have become unsettled. The research highlights the emergence of more recent, alternative-paradigm journals (CPA and AAAJ), both of which now rank highly.
Abstract:
SHARDS, an ESO/GTC Large Program, is an ultra-deep (26.5 mag) spectro-photometric survey with GTC/OSIRIS designed to select and study massive passively evolving galaxies at z=1.0-2.3 in the GOODS-N field using a set of 24 medium-band filters (FWHM ∼ 17 nm) covering the 500-950 nm spectral range. Our observing strategy has been planned to detect, for z>1 sources, the prominent Mg absorption feature (at rest-frame ∼ 280 nm), a distinctive, necessary, and sufficient feature of evolved stellar populations (older than 0.5 Gyr). These observations are being used to: (1) derive for the first time an unbiased sample of high-z quiescent galaxies, which extends to fainter magnitudes the samples selected with color techniques and spectroscopic surveys; (2) derive accurate ages and stellar masses based on robust measurements of spectral features such as the Mg_UV or D(4000) indices; (3) measure their redshift with an accuracy Δz/(1+z)<0.02; and (4) study emission-line galaxies (starbursts and AGN) up to very high redshifts. The well-sampled optical SEDs provided by SHARDS for all sources in the GOODS-N field are a valuable complement for current and future surveys carried out with other telescopes (e.g., Spitzer, HST, and Herschel).
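The D(4000) index mentioned in point (2) has a widely used narrow definition (Balogh et al.): the mean f_ν just redward of the 4000 Å break divided by the mean just blueward of it. A minimal sketch on a synthetic rest-frame spectrum, not SHARDS data:

```python
import numpy as np

def d4000_narrow(wavelength, flux_nu):
    """Narrow D(4000) index: ratio of the mean f_nu in 4000-4100 A
    to the mean f_nu in 3850-3950 A (rest frame, Angstroms)."""
    red = (wavelength >= 4000) & (wavelength <= 4100)
    blue = (wavelength >= 3850) & (wavelength <= 3950)
    return flux_nu[red].mean() / flux_nu[blue].mean()

# Synthetic spectrum with an artificial step at 4000 A:
wl = np.arange(3800.0, 4201.0)              # 1 A sampling
f_nu = np.where(wl < 4000, 1.0, 1.5)
d4000 = d4000_narrow(wl, f_nu)
```

Older, redder stellar populations have stronger breaks and hence larger D(4000), which is why the index helps date the quiescent galaxies targeted by the survey.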
Abstract:
The VLT-FLAMES Tarantula Survey (VFTS) has secured mid-resolution spectra of over 300 O-type stars in the 30 Doradus region of the Large Magellanic Cloud. A homogeneous analysis of such a large sample requires automated techniques, an approach that will also be needed for the upcoming spectroscopic surveys of the Northern and Southern Hemispheres supplementing the Gaia measurements. We point out the importance of Gaia for the study of O stars, summarize the O-star science case of VFTS, and present a test of the automated modeling technique using synthetically generated data. This method employs a genetic-algorithm-based optimization technique in combination with fastwind model atmospheres. The method is found to be robust and able to recover the main photospheric parameters accurately. Precise wind parameters can be obtained as well; however, as expected, for dwarf stars the rate of acceleration of the flow is poorly constrained.
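The genetic-algorithm fitting can be caricatured with a toy model in place of fastwind atmospheres: selection, blend crossover, mutation, and elitism recovering two known parameters from noisy synthetic "observations". Everything here is an illustrative stand-in, not the VFTS pipeline:

```python
import numpy as np

rng = np.random.default_rng(42)

def model(theta, x):
    # Stand-in for an expensive model-atmosphere calculation.
    a, b = theta
    return a * x + b * x**2

# Synthetic "observed" data generated from known parameters plus noise.
x = np.linspace(0.0, 1.0, 50)
truth = np.array([1.5, -0.7])
obs = model(truth, x) + rng.normal(0.0, 0.01, x.size)

def fitness(theta):
    return -np.sum((model(theta, x) - obs) ** 2)   # higher is better

# Minimal genetic algorithm: elitism + blend crossover + mutation.
pop = rng.uniform(-3.0, 3.0, (60, 2))
for _ in range(100):
    scores = np.array([fitness(t) for t in pop])
    elite = pop[np.argsort(scores)[::-1][:10]]        # keep the 10 best
    parents = elite[rng.integers(0, 10, (50, 2))]     # random parent pairs
    w = rng.uniform(0.0, 1.0, (50, 1))
    children = w * parents[:, 0] + (1 - w) * parents[:, 1]  # blend crossover
    children += rng.normal(0.0, 0.05, children.shape)       # mutation
    pop = np.vstack([elite, children])

best = pop[np.argmax([fitness(t) for t in pop])]
```

With a smooth two-parameter landscape the GA typically recovers `truth` to well within a tenth; real spectral fitting adds many more parameters and a far more expensive fitness evaluation, which is exactly why an automated optimizer is needed.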
Abstract:
The application of custom classification techniques and posterior probability modeling (PPM) using Worldview-2 multispectral imagery to archaeological field survey is presented in this paper. Research is focused on the identification of Neolithic felsite stone tool workshops in the North Mavine region of the Shetland Islands in Northern Scotland. Sample data from known workshops surveyed using differential GPS are used alongside known non-sites to train a linear discriminant analysis (LDA) classifier based on a combination of datasets including Worldview-2 bands, band difference ratios (BDR), and topographical derivatives. Principal components analysis is further used to test and reduce dimensionality caused by redundant datasets. Probability models were generated by LDA using principal components and tested with sites identified through geological field survey. Testing shows the prospective ability of this technique, with significance levels between 0.05 and 0.01 and gain statistics between 0.90 and 0.94, higher than those obtained using maximum likelihood and random forest classifiers. Results suggest that this approach is best suited to relatively homogeneous site types, and performs better with correlated data sources. Finally, by combining posterior probability models and least-cost analysis, a survey least-cost efficacy model is generated, showing the utility of such approaches to archaeological field survey.
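A hedged sketch of the PCA-plus-LDA posterior probability modelling described above, on synthetic stand-in data (scikit-learn is assumed; the band values are fabricated, not Worldview-2 measurements):

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)

# Hypothetical training data: 8 input layers (bands, ratios, terrain
# derivatives) per location; class 1 = known workshops, 0 = non-sites.
n = 200
sites = rng.normal(1.0, 1.0, (n, 8))
non_sites = rng.normal(-1.0, 1.0, (n, 8))
X = np.vstack([sites, non_sites])
y = np.array([1] * n + [0] * n)

# PCA removes redundancy among correlated inputs; LDA then yields a
# posterior class probability for every location (the PPM surface).
ppm = make_pipeline(PCA(n_components=4), LinearDiscriminantAnalysis())
ppm.fit(X, y)

# Posterior probability of "workshop" for 5 new site-like locations.
posterior = ppm.predict_proba(rng.normal(1.0, 1.0, (5, 8)))[:, 1]
```

Applied pixel-by-pixel over a study area, `posterior` becomes the probability raster that the paper then feeds into least-cost survey planning.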
Abstract:
This paper provides an overview of IDS types and how they work, as well as configuration considerations and issues that affect them. Advanced methods of increasing the performance of an IDS are explored, such as specification-based IDS for protecting Supervisory Control And Data Acquisition (SCADA) and cloud networks. By reviewing varied studies, ranging from configuration issues and specific problems to custom techniques and cutting-edge work, it also provides a reference for others interested in learning about and developing IDS solutions. Intrusion detection is an area requiring much further study in order to provide solutions that keep pace with evolving services and the networks and systems that support them. This paper aims to be a reference on IDS technologies for other researchers and developers interested in the field of intrusion detection.
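A specification-based IDS, as mentioned for SCADA networks, flags anything outside the protocol's declared behaviour rather than matching known attack signatures, so it can catch novel attacks. A minimal sketch with hypothetical Modbus-style messages and limits:

```python
# Hypothetical specification for one RTU: which function codes the
# protocol permits and which register addresses the device exposes.
ALLOWED_FUNCTIONS = {3, 6, 16}        # e.g. read/write holding registers
ALLOWED_REGISTERS = range(0, 100)

def check(message):
    """Specification-based check of a (function_code, register) message:
    anything outside the declared behaviour raises an alert."""
    code, register = message
    alerts = []
    if code not in ALLOWED_FUNCTIONS:
        alerts.append(f"illegal function code {code}")
    if register not in ALLOWED_REGISTERS:
        alerts.append(f"register {register} out of range")
    return alerts

# Two conforming messages, one illegal code, one out-of-range register.
traffic = [(3, 10), (6, 42), (8, 10), (16, 250)]
alerts = [a for m in traffic for a in check(m)]
```

Contrast with signature-based detection: the whitelist above never needs updating for new attacks, only when the legitimate specification itself changes.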
Abstract:
F-123-R; issued June 1, 1997; two different reports were issued from the Center for Aquatic Ecology with report number 1997 (9)
Abstract:
Understanding how biodiversity is spatially distributed over both the short term and the long term, and what factors affect that distribution, is critical for modelling the spatial pattern of biodiversity as well as for promoting effective conservation planning and practice. This dissertation aims to examine factors that influence short-term and long-term avian distribution from the geographical sciences perspective. The research develops landscape-level habitat metrics to characterize forest height heterogeneity and examines their efficacy in modelling avian richness at the continental scale. Two types of novel vegetation-height-structured habitat metrics are created, based on second-order texture algorithms and the concepts of patch-based habitat metrics. I correlate the height-structured metrics with the richness of different forest guilds, and also examine their efficacy in multivariate richness models. The results suggest that height heterogeneity, beyond canopy height alone, supplements habitat characterization and richness models of two forest bird guilds. The metrics and models derived in this study demonstrate practical examples of utilizing three-dimensional vegetation data for improved characterization of spatial patterns in species richness. The second and third projects focus on analyzing centroids of avian distributions and testing hypotheses regarding the direction and speed of their shifts. I first showcase the usefulness of centroid analysis for characterizing the distribution changes of a few case-study species. Applying the centroid method to 57 permanent resident bird species, I show that multi-directional distribution shifts occurred in a large number of the studied species. I also demonstrate that plains birds are not shifting their distributions faster than mountain birds, contrary to the prediction based on the climate change velocity hypothesis. By modelling the abundance change rate at the regional level, I show that extreme climate events and precipitation measures associate closely with some of the long-term distribution shifts. This dissertation improves our understanding of bird habitat characterization for species richness modelling, and expands our knowledge of how avian populations have shifted their ranges in North America in response to changing environments over the past four decades. The results provide an important scientific foundation for more accurate predictive species distribution modelling in the future.
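The centroid analysis described above reduces, at its core, to an abundance-weighted mean position computed per period, with the shift being the difference between the two centroids. A minimal sketch with fabricated counts, not real survey data:

```python
import numpy as np

def abundance_centroid(lon, lat, abundance):
    """Abundance-weighted centroid (deg lon, deg lat) of survey records."""
    w = abundance / abundance.sum()
    return float(np.sum(w * lon)), float(np.sum(w * lat))

# Hypothetical counts at three survey locations in two periods.
lon = np.array([-100.0, -95.0, -90.0])
lat = np.array([35.0, 40.0, 45.0])
early = np.array([10.0, 10.0, 10.0])    # even abundance early on
late = np.array([5.0, 10.0, 15.0])      # abundance moves north-east later

c_early = abundance_centroid(lon, lat, early)
c_late = abundance_centroid(lon, lat, late)
shift = np.subtract(c_late, c_early)    # per-axis displacement in degrees
```

Because the shift is a vector, repeating this per species naturally exposes the multi-directional shifts the dissertation reports, rather than forcing a single poleward axis.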
Abstract:
ID: 8528; Annual Report to Division of Fisheries, Illinois Department of Natural Resources; F-123-R-11