948 results for: database, range queries, outsourced data, encrypted database, security, information security, cloud security


Relevance: 100.00%

Abstract:

The concept of cloud computing services is appealing to small and medium enterprises (SMEs), offering the opportunity to acquire modern information technology resources as a utility and to avoid costly capital investment in technology. However, adopting cloud computing services presents significant challenges for SMEs, which need to determine an adoption path that ensures their sustainable presence in the cloud computing environment. Information about how SMEs approach the adoption of cloud computing services is fragmented. Through an interpretive design, we suggest that SMEs need to have a strategic and incremental intent, understand their organizational structure, understand the external factors, consider their human resource capacity, and understand the value they expect from cloud computing services in order to forge a successful adoption path. These factors would contribute to a model of cloud services for SMEs.

Relevance: 100.00%

Abstract:

Long-term measurements of particle number size distribution (PNSD) produce a very large number of observations, and their analysis requires an efficient approach in order to produce results in the least possible time and with maximum accuracy. Clustering techniques are a family of sophisticated methods which have recently been employed to analyse PNSD data; however, very little information is available comparing the performance of different clustering techniques on PNSD data. This study applies several clustering techniques (K-means, PAM, CLARA and SOM) in order to identify and apply the optimum technique to PNSD data measured at 25 sites across Brisbane, Australia. A new method, based on the Generalised Additive Model (GAM) with a basis of penalised B-splines, was proposed to parameterise the PNSD data, and the temporal weight of each cluster was also estimated using the GAM. In addition, each cluster was associated with its possible source based on the results of this parameterisation, together with the characteristics of each cluster. The performance of the four clustering techniques was compared using the Dunn index and silhouette width validation values, and K-means was found to perform best, with five clusters being the optimum; five clusters were therefore identified within the data using the K-means technique. The diurnal occurrence of each cluster was used together with other air quality parameters, temporal trends and the physical properties of each cluster in order to attribute each cluster to its source and origin. The five clusters were attributed to three major sources and origins, including regional background particles, photochemically induced nucleated particles and vehicle-generated particles. Overall, clustering was found to be an effective technique for attributing each particle size spectrum to its source, and the GAM was suitable for parameterising the PNSD data. Together, these two techniques can greatly assist researchers in analysing PNSD data for characterisation and source apportionment purposes.
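
As a sketch of the clustering step described above (a generic illustration using scikit-learn, not the study's own code; the input file and the range of cluster numbers are hypothetical), candidate K-means solutions can be scored with the silhouette width to pick the optimum number of clusters:

```python
# Illustrative sketch only: K-means clustering of particle number size
# distribution (PNSD) profiles with silhouette-width model selection.
# Assumes each row of `profiles` is one size spectrum; the file name is
# hypothetical.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import silhouette_score
from sklearn.preprocessing import StandardScaler

profiles = np.loadtxt("pnsd_profiles.csv", delimiter=",")  # hypothetical input
X = StandardScaler().fit_transform(profiles)

best_k, best_score = None, -1.0
for k in range(2, 11):
    labels = KMeans(n_clusters=k, n_init=10, random_state=0).fit_predict(X)
    score = silhouette_score(X, labels)  # higher is better
    if score > best_score:
        best_k, best_score = k, score

print(f"Optimum number of clusters by silhouette width: {best_k} ({best_score:.3f})")
```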

Relevance: 100.00%

Abstract:

Invasive non-native plants have negatively impacted on biodiversity and ecosystem functions world-wide. Because of the large number of species, their wide distributions and varying degrees of impact, we need a more effective method for prioritizing control strategies for cost-effective investment across heterogeneous landscapes. Here, we develop a prioritization framework that synthesizes scientific data, elicits knowledge from experts and stakeholders to identify control strategies, and appraises the cost-effectiveness of strategies. Our objective was to identify the most cost-effective strategies for reducing the total area dominated by high-impact non-native plants in the Lake Eyre Basin (LEB). We use a case study of the ~120 million ha Lake Eyre Basin that comprises some of the most distinctive Australian landscapes, including Uluru-Kata Tjuta National Park. More than 240 non-native plant species are recorded in the Lake Eyre Basin, with many predicted to spread, but there are insufficient resources to control all species. Lake Eyre Basin experts identified 12 strategies to control, contain or eradicate non-native species over the next 50 years. The total cost of the proposed Lake Eyre Basin strategies was estimated at AU$1.7 billion, an average of AU$34 million annually. Implementation of these strategies is estimated to reduce non-native plant dominance by 17 million ha – there would be a 32% reduction in the likely area dominated by non-native plants within 50 years if these strategies were implemented. The three most cost-effective strategies were controlling Parkinsonia aculeata, Ziziphus mauritiana and Prosopis spp. These three strategies combined were estimated to cost only 0.01% of total cost of all the strategies, but would provide 20% of the total benefits. Over 50 years, cost-effective spending of AU$2.3 million could eradicate all non-native plant species from the only threatened ecological community within the Lake Eyre Basin, the Great Artesian Basin discharge springs. Synthesis and applications. Our framework, based on a case study of the ~120 million ha Lake Eyre Basin in Australia, provides a rationale for financially efficient investment in non-native plant management and reveals combinations of strategies that are optimal for different budgets. It also highlights knowledge gaps and incidental findings that could improve effective management of non-native plants, for example addressing the reliability of species distribution data and prevalence of information sharing across states and regions.
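
The cost-effectiveness appraisal described above can be illustrated with a minimal sketch (all strategy names, costs and benefit figures below are invented placeholders, not the study's estimates): strategies are ranked by expected benefit per dollar and selected greedily until a budget is exhausted.

```python
# Minimal illustration of ranking control strategies by cost-effectiveness
# (benefit per dollar) under a fixed budget. All numbers are placeholders.
strategies = [
    # (name, cost in AU$ million, expected benefit in ha of dominance avoided)
    ("Strategy A", 5.0, 2_000_000),
    ("Strategy B", 120.0, 4_000_000),
    ("Strategy C", 0.8, 500_000),
]
budget = 50.0  # AU$ million, placeholder

ranked = sorted(strategies, key=lambda s: s[2] / s[1], reverse=True)
chosen, spent = [], 0.0
for name, cost, benefit in ranked:
    if spent + cost <= budget:
        chosen.append(name)
        spent += cost

print("Selected strategies:", chosen, "total cost (AU$ million):", spent)
```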

Relevance: 100.00%

Abstract:

Selecting an appropriate working correlation structure is pertinent to clustered data analysis using generalized estimating equations (GEE), because an inappropriate choice will lead to inefficient parameter estimation. We investigate the well-known QIC criterion for selecting a working correlation structure and find that the performance of QIC is degraded by a term that is theoretically independent of the correlation structures but has to be estimated with error. This leads us to propose a correlation information criterion (CIC) that substantially improves on the performance of QIC. Extensive simulation studies indicate that the CIC offers a remarkable improvement in selecting the correct correlation structure. We also illustrate our findings using a data set from the Madras Longitudinal Schizophrenia Study.
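
As a hedged sketch of the criterion discussed above (using the GEE implementation in statsmodels; the data set and covariate names are hypothetical, and the exact definition of CIC should be checked against the original paper), candidate working correlation structures can be compared via the trace of the product of the independence-model information matrix and the robust covariance matrix:

```python
# Hedged sketch: comparing GEE working correlation structures with a
# CIC-style criterion trace(Omega_I @ V_R), where Omega_I is the inverse of
# the model-based covariance from an independence fit and V_R is the robust
# (sandwich) covariance under working structure R. Data and columns are
# hypothetical.
import numpy as np
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("longitudinal_data.csv")           # hypothetical input
y = df["response"]
X = sm.add_constant(df[["age", "treatment"]])        # hypothetical covariates
groups = df["subject_id"]

# Independence fit supplies the model-based information matrix Omega_I.
ind_fit = sm.GEE(y, X, groups=groups, family=sm.families.Gaussian(),
                 cov_struct=sm.cov_struct.Independence()).fit()
omega_I = np.linalg.inv(ind_fit.cov_naive)

structures = {
    "independence": sm.cov_struct.Independence(),
    "exchangeable": sm.cov_struct.Exchangeable(),
    "AR(1)": sm.cov_struct.Autoregressive(),
}
for name, cs in structures.items():
    fit = sm.GEE(y, X, groups=groups, family=sm.families.Gaussian(),
                 cov_struct=cs).fit()
    cic = np.trace(omega_I @ fit.cov_robust)         # smaller is preferred
    print(f"{name}: CIC-style value = {cic:.3f}")
```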

Relevance: 100.00%

Abstract:

The surface properties of solid-state pharmaceuticals are of critical importance. Processing modifies surfaces and affects surface roughness, which influences the performance of the final dosage form at many different levels. Surface roughness has an effect on, e.g., the properties of powders, tablet compression and tablet coating. The overall goal of this research was to understand the structures of pharmaceutical surfaces. In this context the specific purpose was to compare four different analysis techniques (optical microscopy, scanning electron microscopy, laser profilometry and atomic force microscopy) in various pharmaceutical applications where the surfaces have quite different roughness scales. This was done by comparing the imaging and roughness analysis techniques using powder compacts, coated tablets and crystal surfaces as model surfaces. Optical microscopy was found to remain a very efficient technique, as it yielded information that SEM and AFM imaging were not able to provide. Roughness measurements complemented the image data and gave quantitative information about height differences. AFM roughness data represent the roughness of only a small part of the surface, and other methods, such as laser profilometry, are therefore needed to provide a larger-scale description of the surface. The newly developed roughness analysis method visualised surface roughness as detailed roughness maps showing local variations in surface roughness values; it was able to provide a picture of the surface heterogeneity and the scale of the roughness. In the coating study, the laser profilometry results showed that the increase in surface roughness was largest during the first 30 minutes of coating, when the surface was not yet fully covered with coating. The SEM images and the dispersive X-ray analysis results showed that the surface was fully covered with coating within 15 to 30 minutes. The combination of the different measurement techniques made it possible to follow the change in surface roughness and the development of the polymer coating. The optical imaging techniques gave a good overview of processes affecting the whole crystal surface, but they lacked the resolution to resolve small nanometre-scale processes. AFM was used to visualize the nanoscale effects of cleaving and to reveal the full surface heterogeneity underlying the optical images. Ethanol washing changed the small (nanoscale) structure to some extent, but its effect at larger scales was small. Water washing caused total reformation of the surface structure at all levels.
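
As a minimal illustration of the quantitative roughness analysis mentioned above (a generic sketch, not the thesis's own roughness-mapping method; the height-map file is hypothetical), standard descriptors such as the arithmetic mean roughness Ra and the root-mean-square roughness Rq can be computed from a measured height map:

```python
# Generic sketch: arithmetic mean (Ra) and root-mean-square (Rq) roughness
# from a 2-D height map, e.g. exported from AFM or laser profilometry.
# The input file is hypothetical.
import numpy as np

z = np.loadtxt("height_map.txt")          # heights in nm, shape (rows, cols)
z = z - z.mean()                          # remove the mean height offset
ra = np.mean(np.abs(z))                   # arithmetic mean roughness
rq = np.sqrt(np.mean(z ** 2))             # root-mean-square roughness
print(f"Ra = {ra:.2f} nm, Rq = {rq:.2f} nm")
```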

Relevance: 100.00%

Abstract:

Climate matching software (CLIMEX) was used to prioritise areas to explore for biological control agents in the native range of cat's claw creeper, Macfadyena unguis-cati (Bignoniaceae), and to prioritise areas in which to release the agents in the plant's introduced ranges. The native distribution of cat's claw creeper was used to predict the potential range of climatically suitable habitat for the plant in its introduced ranges. A Composite Match Index (CMI) was determined with the 'Match Climates' function in order to match the ranges in Australia and South Africa, where the plant is introduced, with its native range in South and Central America. This information was used to determine which areas might yield climatically adapted agents. Locations in northern Argentina had CMI values which best matched sites with cat's claw creeper infestations in Australia and South Africa. None of the sites from which the three currently prioritised biological control agents for cat's claw creeper were collected had CMI values higher than 0.8. The analysis showed that central and eastern Argentina, southern Brazil, Uruguay and parts of Bolivia and Paraguay should be prioritised in the exploration for new biological control agents for cat's claw creeper to be used in Australia and South Africa.
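
CLIMEX's 'Match Climates' algorithm is proprietary, so the following is only a generic illustration of the idea of climate matching (a simple similarity measure over monthly climate variables; all station values and scaling constants are placeholders, and this is not the Composite Match Index itself):

```python
# Generic climate-matching illustration (not CLIMEX's Composite Match Index):
# similarity between two locations based on normalised differences of
# monthly temperature and rainfall. All values are placeholders.
import numpy as np

def climate_similarity(a, b, scale):
    """Return a 0-1 similarity; 1 means identical monthly climates."""
    diff = np.abs(np.asarray(a, float) - np.asarray(b, float)) / scale
    return float(np.exp(-diff.mean()))

# 12 monthly mean temperatures (deg C) followed by 12 monthly rainfalls (mm)
target = np.array([28, 28, 27, 25, 22, 20, 19, 21, 24, 26, 27, 28,
                   150, 140, 120, 60, 40, 30, 20, 25, 30, 60, 90, 130])
candidate = np.array([27, 27, 26, 23, 20, 17, 16, 18, 21, 24, 26, 27,
                      120, 110, 100, 70, 50, 40, 30, 30, 40, 60, 80, 110])
scale = np.array([5.0] * 12 + [50.0] * 12)   # normalisation per variable

print(f"similarity = {climate_similarity(target, candidate, scale):.2f}")
```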

Relevance: 100.00%

Abstract:

Synthetic backcross-derived bread wheats (SBWs) from CIMMYT were grown in north-west Mexico (CIANO) and at sites across Australia during three seasons. A different set of lines was evaluated each season, as new materials became available from the CIMMYT crop enhancement program. Previously, we evaluated both the performance of genotypes across environments and the genotype x environment interaction (G x E). The objective of this study was to interpret the G x E for yield in terms of crop attributes measured at individual sites and to identify the potential environmental drivers of this interaction. Groups of SBWs with consistent yield performance were identified, often comprising closely related lines. However, contrasting performance was also relatively common among sister lines or between a recurrent parent and its SBWs. Early flowering was a common feature among lines with broad adaptation and/or high yield in the northern Australian wheatbelt, while yields in the southern region did not show any association with maturity type. Lines with high yields in the southern and northern regions had cooler canopies during flowering and early grain filling. Among the SBWs with Australian genetic backgrounds, lines best adapted to CIANO were tall (>100 cm), with slightly higher ground cover. These lines also displayed a higher concentration of water-soluble carbohydrates in the stem at flowering, which was negatively correlated with stem number per unit area when evaluated in southern Australia (Horsham). Possible reasons for these patterns are discussed. Selection for yield at CIANO did not specifically identify the lines best adapted to northern Australia, although they were not the most poorly adapted either. In addition, groups of lines with specific adaptation to the south would not have been selected by choosing the highest yielding lines at CIANO. These findings suggest that selection at CIMMYT for Australian environments may be improved by either trait-based selection or yield data combined with trait information. Flowering date, canopy temperature around flowering, tiller density, and water-soluble carbohydrate concentration in the stem at flowering seem likely candidates.

Relevance: 100.00%

Abstract:

The historical development of Finnish nursing textbooks from the late 1880s to 1967: the training of nurses from a Foucauldian perspective. This study aims, first, to analyse the historical development of Finnish nursing textbooks in the training of nurses and in nursing education: what Foucauldian power processes operated in the writing and publishing processes? What picture of nursing did the early nursing books portray, and who were the decision makers? Second, this study aims to analyse the processes of power in nurse training. The time frame extends from the early stages of nurse training in the late 1880s to 1967. The present study is part of textbook research and of the history of professional education in Finland. It seeks to explain how, and by whom or what, power was exercised in the writing of nursing textbooks and through the textbooks themselves. Did someone use these books as a tool to influence nursing education? The third aim of this study is to define and analyse the purpose of nurse training. Michel Foucault's concept of power served as the explanatory framework for this study. A very central part of power is the assembling of data, the supplying of information and messages, and the creation of discourses. When applied to the training of nurses, power dictates what information is taught in the training and contained in the books. Thus, the textbook holds an influential position as a power user in these processes. Other processes in which such power is exercised include school discipline and all other normalizing processes. One of the most powerful means of adaptation was the hall of residence, where nursing pupils were required to live. Trained nurses desired to separate themselves from their untrained predecessors, and from those with less training, by wearing different uniforms and living in separate housing units. The state supported the registration of trained nurses by legislation. With this decision the state made it illegal to work as a nurse without an authorised education, and used these regulations to limit and confirm the professional knowledge and power of nurses. Nurses, physicians and government authorities used textbooks in nursing education as tools to achieve their own purposes and principles. With these books all three groups attempted to confirm their own professional power and knowledge while at the same time limiting the power and expertise of the others. Public authorities sought to unify the training of nurses and the basis of knowledge in all nursing schools in Finland through shared, obligatory textbooks. This standardisation started 20 years before the government unified nursing training in 1930. The textbooks also served as data assemblers, unifying nursing practices in Finnish hospitals, because the Medical Board required all training hospitals to supply the textbooks to units with nursing pupils. For the nurses, and especially for the associations of Finnish nurses, producing and publishing their own textbooks for the training of nurses was part of their professional project. With these textbooks, the nursing elite and the teachers sought to prepare nursing pupils' identities for nursing's very special mission. From the 1960s, nursing was no longer understood as a mission but as an ordinary vocation. Throughout the period studied, nurses and doctors disputed what the optimal relationship between theory and practice in nursing textbooks and in nurse education should be. The discussion of medical knowledge in nursing textbooks took place in the 1930s and 1940s. Nurses were very confused about their own professional knowledge and expertise, which explains why they could not create a new nursing textbook despite the urgent need. A brand new nursing textbook was published in 1967, about 30 years after its predecessor. Keywords: nurse, nurse training, nursing education, power, textbook, Michel Foucault

Relevance: 100.00%

Abstract:

The objective of the study was to determine, through meta-analysis, the rate of confirmed false reports of sexual assault to police. The meta-analysis began with a search for relevant articles, which revealed seven studies in which researchers or their trained helpers evaluated reported sexual assault cases to determine the rate of confirmed false reports. The meta-analysis calculated an overall rate and tested for possible moderators of effect size. The meta-analytic rate of false reports of sexual assault was .052 (95% CI .030, .089). The rates for the individual studies were heterogeneous, suggesting the possibility of moderators of the rate. However, none of the four possible moderators examined (year of publication, whether the data set included information in addition to police reports, whether the study was conducted in the U.S. or elsewhere, and whether inter-rater reliabilities were reported) was significant. The meta-analysis of the seven relevant studies shows that confirmed false allegations of sexual assault made to police occur at a significant rate. The total false reporting rate, including both confirmed and equivocal cases, would be greater than the 5.2% rate found here.
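
A minimal sketch of the pooling step (simple fixed-effect, inverse-variance pooling of logit-transformed proportions; not necessarily the model used in the study, and the study counts below are placeholders):

```python
# Generic illustration: fixed-effect inverse-variance pooling of
# logit-transformed proportions from k studies. Counts are placeholders,
# not the studies analysed in the meta-analysis.
import numpy as np

false_reports = np.array([8, 10, 7, 45, 6, 9, 5])       # placeholder counts
total_reports = np.array([144, 212, 136, 850, 116, 159, 88])

p = false_reports / total_reports
logit = np.log(p / (1 - p))
var = 1 / false_reports + 1 / (total_reports - false_reports)  # logit variance
w = 1 / var

pooled_logit = np.sum(w * logit) / np.sum(w)
se = np.sqrt(1 / np.sum(w))
lo, hi = pooled_logit - 1.96 * se, pooled_logit + 1.96 * se

def inv_logit(x):
    return 1 / (1 + np.exp(-x))

print(f"pooled rate = {inv_logit(pooled_logit):.3f} "
      f"(95% CI {inv_logit(lo):.3f}, {inv_logit(hi):.3f})")
```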

Relevance: 100.00%

Abstract:

Point sources of wastewater pollution, including effluent from municipal sewage treatment plants and intensive livestock and processing industries, can contribute significantly to the degradation of receiving waters (Chambers et al. 1997; Productivity Commission 2004). This has led to increasingly stringent local wastewater discharge limits (particularly for nitrogen, phosphorus and suspended solids), and many municipal authorities and industry managers are now faced with upgrading their existing treatment facilities in order to comply. However, with high construction, energy and maintenance expenses and increasing labour costs, traditional wastewater treatment systems are becoming an escalating financial burden for the communities and industries that operate them. This report was generated, in the first instance, for the Burdekin Shire Council to provide information on design aspects and parameters critical for developing duckweed-based wastewater treatment (DWT) in the Burdekin region. However, the information will be relevant to a range of wastewater sources throughout Queensland. The information has been collated from the published literature and from overseas and local studies of pilot- and full-scale DWT systems. This report also considers options to generate revenue from duckweed production (a significant feature of DWT), and provides specifications and component cost information (current at the time of publication) for a large-scale demonstration of an integrated DWT and fish production system.

Relevance: 100.00%

Abstract:

Background: The Queensland East Coast Otter Trawl Fishery (ECOTF) for penaeid shrimp operates within Australia's Great Barrier Reef World Heritage Area (GBRWHA). The past decade has seen the implementation of conservation and fisheries management strategies to reduce the impact of the ECOTF on the seabed and improve biodiversity conservation. New information from electronic vessel location monitoring systems (VMS) provides an opportunity to review the interactions between the ECOTF and spatial closures for biodiversity conservation. Methodology and Results: We used fishing metrics, spatial information on the distribution of closures and modelled VMS data in a geographical information system (GIS) to assess change in effort of the trawl fishery from 2001 to 2009 and to quantify the exposure of 70 reef, non-reef and deep-water bioregions to trawl fishing. The number of trawlers and the number of days fished almost halved between 2001 and 2009, and new spatial closures introduced in 2004 reduced the area zoned as available for trawl fishing by 33%. However, we found only a relatively minor change in the spatial footprint of the fishery as a result of the new spatial closures. Non-reef bioregions benefited the most from the new spatial closures, followed by deep and reef bioregions. Conclusions/Significance: Although the catch of non-target species remains an issue of concern for fisheries management, the small spatial footprint of the ECOTF relative to the size of the GBRWHA means that the impact on benthic habitats is likely to be negligible. The decline in effort as a result of fishing industry structural adjustment, increasing variable costs and the business decisions of fishers is likely to continue the trend to fish only in the most productive areas. This will provide protection for most benthic habitats without any further legislative or management intervention.
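
A minimal sketch of how a trawl-effort footprint can be derived from VMS point data (a generic grid-binning approach rather than the study's GIS workflow; the input file, column names and cell size are hypothetical):

```python
# Generic sketch: grid a set of VMS fishing positions into cells and
# estimate the spatial footprint (number and area of cells fished).
# Input file, column names and cell size are hypothetical.
import numpy as np
import pandas as pd

vms = pd.read_csv("vms_points.csv")          # columns: lon, lat (fishing records)
cell = 0.1                                   # grid cell size in degrees

lon_bins = np.arange(vms.lon.min(), vms.lon.max() + cell, cell)
lat_bins = np.arange(vms.lat.min(), vms.lat.max() + cell, cell)
counts, _, _ = np.histogram2d(vms.lon, vms.lat, bins=[lon_bins, lat_bins])

fished_cells = np.count_nonzero(counts)
print(f"cells with trawl effort: {fished_cells} "
      f"({fished_cells * cell * cell:.2f} square degrees)")
```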

Relevance: 100.00%

Abstract:

Snapper (Pagrus auratus) is widely distributed throughout subtropical and temperate southern oceans and forms a significant recreational and commercial fishery in Queensland, Australia. Using data from government reports, media sources, popular publications and a government fisheries survey carried out in 1910, we compiled information on individual snapper fishing trips that took place prior to the commencement of fisherywide organized data collection, from 1871 to 1939. In addition to extracting all available quantitative data, we translated qualitative information into bounded estimates and used multiple imputation to handle missing values, forming 287 records for which catch rate (snapper fisher⁻¹ h⁻¹) could be derived. Uncertainty was handled through a parametric maximum likelihood framework (a transformed trivariate Gaussian), which facilitated statistical comparisons between data sources. No statistically significant differences in catch rates were found among media sources and the government fisheries survey. Catch rates remained stable throughout the time series, averaging 3.75 snapper fisher⁻¹ h⁻¹ (95% confidence interval, 3.42–4.09) as the fishery expanded into new grounds. In comparison, a contemporary (1993–2002) south-east Queensland charter fishery produced an average catch rate of 0.4 snapper fisher⁻¹ h⁻¹ (95% confidence interval, 0.31–0.58). These data illustrate the productivity of a fishery during its earliest years of development and represent the earliest catch rate data globally for this species. By adopting a formalized approach to address issues common to many historical records – missing data, a lack of quantitative information and reporting bias – our analysis demonstrates the potential for historical narratives to contribute to contemporary fisheries management.
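
A minimal sketch of handling bounded historical estimates (a generic multiple-imputation scheme that draws each uncertain value uniformly from its stated bounds; the study itself used a parametric maximum likelihood model, and the file and column names here are hypothetical):

```python
# Generic sketch: multiple imputation of interval-bounded historical records.
# Each uncertain catch value is drawn uniformly between its lower and upper
# bound, the mean catch rate is computed per imputation, and results are
# summarised across imputations. File and column names are hypothetical.
import numpy as np
import pandas as pd

trips = pd.read_csv("historical_trips.csv")   # columns: catch_lo, catch_hi,
                                               #          fishers, hours
rng = np.random.default_rng(0)
m = 50                                         # number of imputations

means = []
for _ in range(m):
    catch = rng.uniform(trips.catch_lo, trips.catch_hi)
    rate = catch / (trips.fishers * trips.hours)   # snapper per fisher-hour
    means.append(rate.mean())

means = np.array(means)
print(f"mean catch rate across imputations: {means.mean():.2f} "
      f"(between-imputation SD {means.std(ddof=1):.2f})")
```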

Relevance: 100.00%

Abstract:

Accounting information systems (AIS) capture and process accounting data and provide valuable information for decision-makers. However, in a rapidly changing environment, continual management of the AIS is necessary for organizations to optimise performance outcomes. We suggest that building a dynamic AIS capability enhances both accounting process performance and organizational performance. Using the dynamic capabilities framework (Teece 2007), we propose that a dynamic AIS capability can be developed through the synergy of three competencies: a flexible AIS, a complementary business intelligence system, and accounting professionals with IT technical competency. Using survey data, we find evidence of a positive association between a dynamic AIS capability, accounting process performance, and overall firm performance. The results suggest that developing a dynamic AIS resource can add value to an organization. This study provides guidance for organizations looking to leverage the performance outcomes of their AIS environment.

Relevance: 100.00%

Abstract:

This study reports an investigation of the ion exchange treatment of sodium chloride solutions in relation to the use of resin technology for applications such as desalination of brackish water. In particular, a strong acid cation (SAC) resin (DOW Marathon C) was studied to determine its capacity for sodium uptake and to evaluate the fundamentals of the ion exchange process involved. Key questions to answer included: the impact of resin identity; the best models to simulate the kinetics and equilibrium exchange behaviour of sodium ions; the difference between using linear least squares (LLS) and non-linear least squares (NLLS) methods for data interpretation; and the effect of changing the type of anion in solution which accompanied the sodium species. Kinetic studies suggested that the exchange process was best described by a pseudo-first-order rate expression based upon non-linear least squares analysis of the test data. Application of the Langmuir-Vageler isotherm model was recommended, as it allowed confirmation that experimental conditions were sufficient for maximum loading of sodium ions to occur. The Freundlich expression best fitted the equilibrium data when the information was analysed by a NLLS approach. In contrast, LLS methods suggested that the Langmuir model was optimal for describing the equilibrium process. The Competitive Langmuir model, which considers the stoichiometric nature of the ion exchange process, estimated the maximum loading of sodium ions to be 64.7 g Na/kg resin. This latter value was comparable to sodium ion capacities for SAC resins published previously. The inherent discrepancies involved in using linearized versions of kinetic and isotherm equations were illustrated and, despite their widespread use, the value of this latter approach was questionable. The equilibrium behaviour of sodium ions from sodium fluoride solution revealed that sodium ions were more preferred by the resin than in the sodium chloride case. The solution chemistry of hydrofluoric acid was suggested as promoting the affinity of the sodium ions for the resin.
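
As an illustration of the non-linear versus linearized fitting discussed above (a generic sketch with placeholder equilibrium data, not the study's measurements), the Langmuir isotherm q = qmax*K*C/(1 + K*C) can be fitted directly by NLLS and compared with the linearized C/q versus C regression:

```python
# Generic sketch: fitting a Langmuir isotherm q = qmax*K*C / (1 + K*C)
# to equilibrium data by non-linear least squares (NLLS) and by the
# linearized C/q vs C form (LLS). Data points are placeholders.
import numpy as np
from scipy.optimize import curve_fit
from scipy.stats import linregress

C = np.array([5.0, 10, 25, 50, 100, 200, 400])      # mg/L at equilibrium
q = np.array([12.0, 21, 37, 48, 56, 61, 63])        # mg Na / g resin

def langmuir(C, qmax, K):
    return qmax * K * C / (1 + K * C)

# Non-linear least squares fit
(qmax_nl, K_nl), _ = curve_fit(langmuir, C, q, p0=[60, 0.05])

# Linearized form: C/q = C/qmax + 1/(K*qmax)
res = linregress(C, C / q)
qmax_lin = 1 / res.slope
K_lin = res.slope / res.intercept

print(f"NLLS: qmax = {qmax_nl:.1f}, K = {K_nl:.4f}")
print(f"LLS : qmax = {qmax_lin:.1f}, K = {K_lin:.4f}")
```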

Relevance: 100.00%

Abstract:

In the beginning of the 1990s, the legislation regarding the municipalities and the system of central government transfers were reformed in Finland. This resulted in a move from detailed governmental control to increased municipal autonomy. The purpose of this decentralization was to enable the municipalities to better adapt their administration and service supply to local needs. The aim of this study was to explore the effects of the increased municipal autonomy on the organization of services for people with intellectual disabilities. Did the increased autonomy cause the municipalities to alter their service supply and production, and did the services become more adapted to local needs? The data consist of statistical information on service use and production, together with background data such as demographics, economics and election results, for 452 municipalities in Finland from the years 1994 and 2000. The methods used are cluster analysis, discriminant analysis and factor analysis. The municipalities could be grouped into two categories: those which offered mainly one kind of residential service and those which had a more varied mix of services. The use of institutional care had decreased, and the municipalities which used institutional care as their primary form of service in 2000 were mostly very small. The situation had changed from 1994, when institutional care was the primary service for municipalities of all sizes. Service production had also become more differentiated, and the municipalities had started using more varied means of production. More municipalities had started producing their own services, and private production had increased as well. Furthermore, the increase in local autonomy had opened up possibilities for local politics to influence both the service selection and the methods of production. The most significant drivers of changes in the service structure were high unemployment and an increasing share of elderly people in the population, particularly in sparsely populated areas. Municipalities with a low level of resources had made more changes in their service organization, while those with more resources had been able to carry on as before. Keywords: service structure, services for people with intellectual disabilities, municipalities, contingency theory, New Public Management