891 results for information source
Abstract:
PURPOSE: This study aimed to compare selectivity characteristics among institutions to determine whether they differ by institutional funding source (public vs. private) or research activity level (research vs. non-research). METHODS: This study included information provided by the Commission on Accreditation in Physical Therapy Education (CAPTE) and the Federation of State Boards of Physical Therapy. Data were extracted for all students who graduated in 2011 from accredited physical therapy programs in the United States. The public and private designations of the institutions were taken directly from the classifications in the 'CAPTE annual accreditation report,' and high and low research activity was determined from Carnegie classifications. The institutions were classified into four groups: public/research intensive, public/non-research intensive, private/research intensive, and private/non-research intensive. Descriptive and comparison analyses with post hoc testing were performed to determine whether there were statistically significant differences among the four groups. RESULTS: Although there were statistically significant baseline grade point average differences among the four groups, there were no significant differences in licensure pass rates or in any of the selectivity variables of interest. CONCLUSION: Selectivity characteristics did not differ by institutional funding source (public vs. private) or research activity level (research vs. non-research). This suggests that concerns about reduced selectivity among physical therapy programs, specifically the types experiencing the largest proliferation, appear less warranted.
Abstract:
The Continuous Plankton Recorder (CPR) survey was conceived from the outset as a programme of applied research designed to assist the fishing industry. Its survival and continuing vigour after 70 years is a testament to its utility, which has been achieved in spite of great changes in our understanding of the marine environment and in our concerns over how to manage it. The CPR has been superseded in several respects by other technologies, such as acoustics and remote sensing, but it continues to provide unrivalled seasonal and geographic information about a wide range of zooplankton and phytoplankton taxa. The value of this coverage increases with time and provides the basis for placing recent observations into the context of long-term, large-scale variability and thus suggesting what the causes are likely to be. Information from the CPR is used extensively in judging environmental impacts and producing quality status reports (QSR); it has shown the distributions of fish stocks, which had not previously been exploited; it has pointed to the extent of ungrazed phytoplankton production in the North Atlantic, which was a vital element in establishing the importance of carbon sequestration by phytoplankton. The CPR continues to be the principal source of large-scale, long-term information about the plankton ecosystem of the North Atlantic. It has recently provided extensive information about the biodiversity of the plankton and about the distribution of introduced species. It serves as a valuable example for the design of future monitoring of the marine environment and it has been essential to the design and implementation of most North Atlantic plankton research.
Abstract:
It has been proposed that the field of appropriate technology (AT) - small-scale, energy-efficient and low-cost solutions - can be of tremendous assistance with many sustainable development challenges, such as food and water security, health, shelter, education and work opportunities. Unfortunately, there has not yet been a significant uptake of AT by organizations, researchers, policy makers or the mainstream public working in the many areas of the development sector. Some of the biggest barriers to higher AT engagement include: 1) AT perceived as inferior or 'poor person's technology', 2) questions of technological robustness, design, fit and transferability, 3) funding, 4) institutional support, as well as 5) general barriers associated with tackling rural poverty. With the rise of information and communication technologies (ICTs) for online networking and knowledge sharing, the possibilities to tap into collaborative open-access and open-source AT are growing, and so is the prospect for collective poverty-reducing strategies, enhancement of entrepreneurship, communications, education and the diffusion of life-changing technologies. In short, the same collaborative philosophy employed in the success of open source software can be applied to the hardware design of technologies to improve sustainable development efforts worldwide. To analyze current barriers to open source appropriate technology (OSAT) and explore opportunities to overcome such obstacles, a series of interviews with researchers and organizations working in the field of AT was conducted. The results of the interviews confirmed the majority of the barriers identified in the literature, but also revealed that the most pressing problem for organizations and researchers currently working in the field of AT is the need for much better communication and collaboration to share knowledge and resources and to work in partnership.
In addition, interviews showcased general receptiveness to the principles of collaborative innovation and open source on the ground level. A much greater focus on networking, collaboration, demand-led innovation, community participation, and the inclusion of educational institutions through student involvement can be of significant help to build the necessary knowledge base, networks and the critical mass exposure for the growth of appropriate technology.
Abstract:
In this letter, a standard postnonlinear blind source separation algorithm is proposed, based on the MISEP method, which is widely used in linear and nonlinear independent component analysis. To best suit a wide class of postnonlinear mixtures, we adapt the MISEP method to incorporate a priori information about the mixtures. In particular, a group of three-layered perceptrons and a linear network are used as the unmixing system to separate sources in the postnonlinear mixtures, and another group of three-layered perceptrons is used as the auxiliary network. The learning algorithm for the unmixing system is then obtained by maximizing the output entropy of the auxiliary network. The proposed method is applied to postnonlinear blind source separation of both simulated signals and real speech signals, and the experimental results demonstrate its effectiveness and efficiency in comparison with existing methods.
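The postnonlinear model addressed above can be made concrete with a toy sketch: observations are a linear mixture passed through componentwise nonlinearities, and the unmixing system mirrors that structure in reverse. The signals, mixing matrix and tanh nonlinearity below are illustrative assumptions, and the "oracle" unmixing stage stands in for the output-entropy maximization that MISEP actually performs to learn both stages.

```python
import numpy as np

rng = np.random.default_rng(0)

# Two independent toy sources (hypothetical, not the letter's test signals)
n = 1000
s = np.vstack([np.sign(rng.standard_normal(n)) * rng.uniform(0.5, 1.0, n),
               rng.uniform(-1.0, 1.0, n)])

# Postnonlinear mixture: linear mixing A followed by a componentwise nonlinearity
A = np.array([[1.0, 0.6],
              [0.5, 1.0]])
x = np.tanh(A @ s)  # observed signals

# The unmixing system mirrors the mixture: a componentwise inverse
# nonlinearity (here the exact inverse, arctanh) followed by a linear stage.
# In MISEP both stages are learned; here they are set by hand for illustration.
g_x = np.arctanh(np.clip(x, -0.999999, 0.999999))
W = np.linalg.inv(A)  # oracle linear stage
y = W @ g_x

# With the oracle unmixing the sources are recovered up to numerical error
print(np.allclose(y, s, atol=1e-3))  # True
```

In practice neither the inverse nonlinearity nor the unmixing matrix is known, which is exactly why the letter trains perceptrons for the componentwise stages and maximizes the auxiliary network's output entropy.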
Abstract:
We investigate the source of information advantage in inter-dealer FX trading using data on trades and counterparty identities. In liquid dollar exchange rates, information is concentrated among dealers that trade most frequently and specialize their activity in a particular rate. In cross-rates, traders that engage in triangular arbitrage are best informed. Better-informed traders are also located on larger trading floors. In cross-rates, the ability to forecast flows explains all of the advantage of the triangular arbitrageurs. In liquid dollar rates, specialist traders can forecast both order flow and the component of exchange rate changes that is uncorrelated with flow.
Abstract:
Keeping a record of operator experience remains a challenge for operations management and a major source of inefficiency in information management. The objective is to develop a framework that enables an explicit representation of experience based on information use. A purposive sampling method is used to select four small and medium-sized enterprises as case studies. The unit of analysis is the production process in the machine shop. Data collection is by structured interview, observation and documentation. A comparative case analysis is applied. The findings suggest experience is an accumulation of tacit information feedback, which can be made explicit in an information use interoperability matrix. The matrix is conditioned upon an information use typology, which is strategic in waste reduction. The limitations include the difficulty of participant anonymity where the organisation nominates a participant. Areas for further research include application of the concepts to knowledge management and shop floor resource management.
Abstract:
A general approach to information correction and fusion for belief functions is proposed, where not only may the information items be irrelevant, but sources may lie as well. We introduce a new correction scheme, which takes into account uncertain metaknowledge on the source's relevance and truthfulness and which generalizes Shafer's discounting operation. We then show how to reinterpret all connectives of Boolean logic in terms of source behavior assumptions with respect to relevance and truthfulness. We are led to generalize the unnormalized Dempster's rule to all Boolean connectives, while taking into account the uncertainties pertaining to assumptions concerning the behavior of sources. Eventually, we further extend this approach to an even more general setting, where source behavior assumptions do not have to be restricted to relevance and truthfulness. We also establish the commutativity property between correction and fusion processes, when the behaviors of the sources are independent.
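Shafer's discounting operation, which the correction scheme above generalizes, is simple enough to sketch: with probability alpha the source is relevant and its mass function stands; with probability 1 - alpha it is irrelevant and its information becomes vacuous, so that mass moves to the whole frame. The mass function and reliability value below are hypothetical, and the handling of truthfulness (a lying source shifting mass toward complements) is omitted.

```python
def discount(m, frame, alpha):
    """Shafer's discounting: scale every focal mass by the source's
    reliability alpha, and move the remaining 1 - alpha onto the frame
    (the vacuous belief 'anything is possible')."""
    out = {A: alpha * v for A, v in m.items()}
    out[frame] = out.get(frame, 0.0) + (1.0 - alpha)
    return out

# Toy mass function on the frame {'a', 'b'} (hypothetical numbers)
frame = frozenset({'a', 'b'})
m = {frozenset({'a'}): 0.7, frame: 0.3}

d = discount(m, frame, alpha=0.8)
print(round(d[frozenset({'a'})], 2))  # 0.56
print(round(d[frame], 2))             # 0.44
```

Note that the masses still sum to one; discounting redistributes commitment rather than renormalizing it, which is what makes it a special case of the behavior-based corrections the paper develops.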
Abstract:
Biodiversity may be seen as a scientific measure of the complexity of a biological system, implying an information basis. Complexity cannot be directly valued, so economists have tried to define the services it provides, though often just valuing the services of 'key' species. Here we provide a new definition of biodiversity as a measure of functional information, arguing that complexity embodies meaningful information as Gregory Bateson defined it. We argue that functional information content (FIC) is the potentially valuable component of total (algorithmic) information content (AIC), as it alone determines biological fitness and supports ecosystem services. Inspired by recent extensions to the Noah's Ark problem, we show how FIC/AIC can be calculated to measure the degree of substitutability within an ecological community. Establishing substitutability is an essential foundation for valuation. From it, we derive a way to rank whole communities by Indirect Use Value, through quantifying the relation between system complexity and the production rate of ecosystem services. Understanding biodiversity as information evidently serves as a practical interface between economics and ecological science. © 2012 Elsevier B.V.
Abstract:
We have determined the mitochondrial genotype of liver fluke present in Bison (Bison bonasus) from the herd maintained in the Bialowieza National Park in order to determine the origin of the infection. Our results demonstrated that the infrapopulations present in the bison were genetically diverse and were likely to have been derived from the population present in local cattle. From a consideration of the genetic structure of the liver fluke infrapopulations we conclude that the provision of hay at feeding stations may be implicated in the transmission of this parasite to the bison. This information may be of relevance to the successful management of the herd. © 2012 Elsevier B.V.
Abstract:
We consider a wireless relay network with one source, one relay and one destination, where communications between nodes are performed over N orthogonal channels. This, for example, is the case when orthogonal frequency division multiplexing is employed for data communications. Since the power available at the source and relay is limited, we study optimal power allocation strategies at the source and relay in order to maximize the overall source-destination capacity. Depending on whether channel state information is available at both the source and relay or only at the relay, power allocation is performed at both the source and relay or only at the relay. Considering different setups for the problem, various optimization problems are formulated and solved. Some properties of the optimal solution are also proved.
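For a single link with full channel state information, the capacity-maximizing allocation over N parallel channels is the classical water-filling solution, which the source/relay formulations above build upon. Below is a minimal sketch with hypothetical channel gains and power budget; it is the textbook single-link rule, not the paper's relay-optimization algorithm.

```python
import numpy as np

def waterfill(gains, P, iters=100):
    """Water-filling over N parallel channels: maximize sum log2(1 + p_i * g_i)
    subject to sum p_i = P, p_i >= 0. The optimum is p_i = max(0, mu - 1/g_i),
    with the common water level mu found here by bisection."""
    g = np.asarray(gains, dtype=float)
    lo, hi = 0.0, P + 1.0 / g.min()  # mu can be no larger than this
    for _ in range(iters):
        mu = 0.5 * (lo + hi)
        p = np.maximum(0.0, mu - 1.0 / g)
        if p.sum() > P:
            hi = mu  # water level too high, total power exceeds budget
        else:
            lo = mu
    return np.maximum(0.0, 0.5 * (lo + hi) - 1.0 / g)

# Hypothetical gains for N = 4 orthogonal channels, total budget P = 2
p = waterfill([2.0, 1.0, 0.5, 0.1], P=2.0)
print(p.round(4))  # strongest channels get the most power; weakest get none
```

With these numbers the water level settles at mu = 1.75, so the two weakest channels (inverse gains 2 and 10) stay above the water and receive zero power, illustrating why poor subcarriers are switched off entirely.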
Abstract:
Although described almost a century ago, interest in ionic liquids has flourished in the last two decades, with significant advances in the understanding of their chemical, physical and biological property sets driving their widespread application across multiple and diverse research areas. Significant progress has been made through the contributions of numerous research groups detailing novel libraries of ionic liquids, often 'task-specific' designer solvents for application in areas as diverse as separation technology, catalysis and bioremediation. Basic antimicrobial screening has often been included as a surrogate indication of the environmental impact of these compounds, widely regarded as 'green' solvents. Treating the biological properties, specifically toxicity, of these compounds solely as a liability has obstructed their potential application as sophisticated designer biocides. A recent tangent in ionic liquids research now aims to harness the tuneable biological properties of these compounds in the design of novel potent antimicrobials, recognising their unparalleled flexibility for chemical diversity in a severely depleted antimicrobial arsenal. This review concentrates primarily on the antimicrobial potential of ionic liquids and aims to consolidate contemporary microbiological background information, assessment protocols and future considerations necessary to advance the field in light of the urgent need for antimicrobial innovation.
Abstract:
The Magellanic Clouds are uniquely placed to study the stellar contribution to dust emission. Individual stars can be resolved in these systems even in the mid-infrared, and they are close enough to allow detection of infrared excess caused by dust. We have searched the Spitzer Space Telescope data archive for all Infrared Spectrograph (IRS) staring-mode observations of the Small Magellanic Cloud (SMC) and found that 209 Infrared Array Camera (IRAC) point sources within the footprint of the Surveying the Agents of Galaxy Evolution in the Small Magellanic Cloud (SAGE-SMC) Spitzer Legacy programme were targeted, within a total of 311 staring-mode observations. We classify these point sources using a decision tree method of object classification, based on infrared spectral features, continuum and spectral energy distribution shape, bolometric luminosity, cluster membership and variability information. We find 58 asymptotic giant branch (AGB) stars, 51 young stellar objects, 4 post-AGB objects, 22 red supergiants, 27 stars (of which 23 are dusty OB stars), 24 planetary nebulae (PNe), 10 Wolf-Rayet stars, 3 H II regions, 3 R Coronae Borealis stars, 1 Blue Supergiant and 6 other objects, including 2 foreground AGB stars. We use these classifications to evaluate the success of photometric classification methods reported in the literature.
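The decision-tree style of classification described above, branching on spectral features, luminosity, cluster membership and variability, can be illustrated with a toy rule set. Every feature name and threshold below is a hypothetical stand-in, not the survey's actual decision criteria.

```python
def classify_point_source(src):
    """Toy decision-tree classifier in the spirit of the scheme described
    above. Features and thresholds are illustrative assumptions only."""
    # Cluster membership plus youth indicators -> young stellar object
    if src.get("in_cluster") and src.get("young_features"):
        return "YSO"
    # Dust features: split evolved dusty stars by bolometric luminosity
    if src.get("silicate_emission") or src.get("carbon_dust"):
        return "AGB" if src.get("luminosity", 0.0) < 1e5 else "RSG"
    # Strong emission lines with no dust continuum -> planetary nebula
    if src.get("emission_lines"):
        return "PN"
    # Default: an ordinary (possibly dusty) star
    return "star"

print(classify_point_source({"silicate_emission": True, "luminosity": 3e4}))
```

The real scheme walks through many more branches (continuum shape, spectral energy distribution, variability), but the structure is the same: each observed source falls through a fixed cascade of feature tests until a class is assigned.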
Abstract:
Risks are an essential feature of future climate change impacts. We explore whether knowledge that climate change might be the source of increasing pine beetle impacts on public or private forests affects stated risk estimates of damage, elicited using the exchangeability method. We find that across subjects the difference between public and private forest status does not influence stated risks, but the group told that global warming is the cause of pine beetle damage has significantly higher risk perceptions than the group not given this information.
Abstract:
Despite recent technological innovations, the transport sector continues to have significant impacts on the economy and the environment. Indeed, success in reducing emissions in this sector has fallen short of what is desirable. This is due to various factors, such as urban sprawl and the existence of several obstacles to the market penetration of cleaner technologies. Consequently, the "Europe 2020" strategy highlights the need to improve efficiency in the use of existing road infrastructure. In this context, the main objective of this work is to improve the understanding of how an appropriate route choice can contribute to reducing emissions under different spatial and temporal circumstances. At the same time, it aims to evaluate different traffic management strategies, namely their potential in terms of performance and energy and environmental efficiency. The integration of empirical and analytical methods to assess the impact of different traffic optimization strategies on emissions of CO2 and local pollutants is one of the main contributions of this work. This thesis is divided into two main components. The first, predominantly empirical, was based on vehicles equipped with a GPS data logger to collect the driving dynamics data needed to calculate emissions. Approximately 13,200 km were driven over several routes with distinct scales and characteristics: an urban area (Aveiro), a metropolitan area (Hampton Roads, VA) and an intercity corridor (Porto-Aveiro). The second part, predominantly analytical, was based on the application of an integrated traffic and emissions simulation platform. Using this platform, performance functions associated with various segments of the studied networks were developed and then applied in traffic assignment models.
The results from both perspectives demonstrated that fuel consumption and emissions can be significantly minimized through appropriate route choices and advanced traffic management systems. Empirically, it was shown that selecting a suitable route can contribute to a significant reduction in emissions: potential reductions of up to 25% in CO2 emissions and up to 60% in local pollutants were identified. Through the application of traffic models, it was demonstrated that traffic-related environmental costs can be significantly reduced (by up to 30%) by changing the distribution of flows along a corridor with four alternative routes. However, despite the positive results regarding the potential for emission reductions based on appropriate route selection, some trade-offs and constraints were identified that should be considered in future eco-navigation systems. Among these constraints, it should be noted that: i) minimizing different pollutants may require different navigation strategies; ii) minimizing pollutant emissions frequently involves choosing urban routes (in densely populated areas); iii) at higher penetration levels of eco-navigation devices, system-wide environmental impacts may be greater than if drivers were guided by traditional devices focused on minimizing travel time. This work demonstrated that traffic management strategies aimed at minimizing CO2 emissions are compatible with minimizing travel time. On the other hand, minimizing local pollutants can lead to a considerable increase in travel time. Nevertheless, given the downward trend in local pollutant emission factors, these conflicting objectives are expected to diminish in the medium term.
The methodology developed shows high potential for application, whether through mobile devices, infrastructure-to-vehicle communication systems or other advanced traffic management systems.