948 results for Modern State
Abstract:
Research on Enterprise Resource Planning (ERP) Systems has become a well-established theme in Information Systems (IS) research. ERP Systems, given their unique differences from other IS applications, have provided an interesting backdrop for testing and re-testing some of the key and fundamental concepts in IS. While some researchers have tested well-established concepts of technology acceptance, system usage and system success in the context of ERP Systems, others have researched how new paradigms like cloud computing and social media integrate with ERP Systems. Moreover, ERP Systems have provided the context for cross-disciplinary research, such as knowledge management, project management and business process management research. Almost two decades after its inception in IS research, this paper provides a critique of 198 papers published on ERP Systems from 2006 to 2012. We observe patterns in ES research, draw comparisons with past studies and propose future research directions.
Abstract:
Food in schools is typically understood from a biomedical perspective. At practical, ideational and material levels, whether addressed pedagogically or bureaucratically, food in schools is generally considered from a natural sciences perspective. This perspective manifests as the bioenergetic principle of energy in versus energy out and appears in policy focused on issues such as obesity and physical activity. Despite the considerable literature on the sociology of food and eating, little is understood about food in schools from a sociological perspective. This oversight of one of the most fundamental requirements of the human condition, namely food, should be of concern for educators. Investigating food through a political economy lens means understanding food in schools as part of broader economic, political, social and cultural conditions. Hence, a political economy of food and schooling is concerned with the formation of ideas about food relative to political, economic and cultural ideologies in social practice. Drawing on a critical sociology study of the food messages students receive in the primary school curriculum, this paper reports on some of the official food messages of an Australian state's education policy, as a case to highlight the current political economy of food in Australia. It examines the role of the corporate food industry in the formation of Australian food policy and how that policy created artefacts infused with competing messages. The paper highlights how food and nutrition policy moved from being solely a health concern to incorporating an economic dimension, and links that shift with the quality of food available in Queensland schools.
Abstract:
A satellite-based observation system can continuously or repeatedly generate a user state vector time series that may contain useful information. One typical example is the collection of International GNSS Service (IGS) station daily and weekly combined solutions. Another is the epoch-by-epoch kinematic position time series of a receiver derived by a GPS real-time kinematic (RTK) technique. Although some multivariate analysis techniques have been adopted to assess the noise characteristics of multivariate state time series, statistical testing has been limited to univariate time series. After reviewing frequently used hypothesis test statistics in the univariate analysis of GNSS state time series, the paper presents a number of T-squared multivariate statistics for use in the analysis of multivariate GNSS state time series. These T-squared test statistics take the correlation between coordinate components into account, which is neglected in univariate analysis. Numerical analysis was conducted with the multi-year time series of an IGS station to demonstrate the results of the multivariate hypothesis testing in comparison with the univariate hypothesis testing results. The results demonstrate that, in general, testing for multivariate mean shifts and outliers tends to reject fewer data samples than testing for univariate mean shifts and outliers at the same confidence level. It is noted that neither univariate nor multivariate data analysis methods are intended to replace physical analysis; rather, they should be treated as complementary statistical methods for a priori or a posteriori investigations. Physical analysis is subsequently necessary to refine and interpret the results.
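As an illustration of the kind of test the paper advocates, the sketch below (not the paper's exact procedure; the series, covariance values and significance level are illustrative assumptions) flags multivariate outliers in a three-component coordinate time series with Hotelling's T-squared statistic, which accounts for the correlation between components that separate univariate tests ignore.

```python
import numpy as np
from scipy import stats

# Illustrative 3-component coordinate residual series (e.g. daily ENU, in mm).
rng = np.random.default_rng(0)
n, p = 1000, 3
series = rng.multivariate_normal(
    mean=np.zeros(p),
    cov=[[4.0, 1.0, 0.5], [1.0, 4.0, 0.8], [0.5, 0.8, 9.0]],  # mm^2, illustrative
    size=n,
)
series[123] += np.array([15.0, -12.0, 20.0])   # inject one gross outlier

xbar = series.mean(axis=0)
S_inv = np.linalg.inv(np.cov(series, rowvar=False))
d = series - xbar
T2 = np.einsum("ij,jk,ik->i", d, S_inv, d)     # T^2 for each epoch

# Large-sample approximation to the critical value via the F distribution,
# T^2 * (n-p)/(p*(n-1)) ~ F(p, n-p), used here as a simple screening threshold.
alpha = 0.01
crit = stats.f.ppf(1 - alpha, p, n - p) * p * (n - 1) / (n - p)
print("flagged epochs:", np.flatnonzero(T2 > crit))
```

Epochs flagged in this way would then be candidates for the subsequent physical analysis that the paper recommends.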
Abstract:
In Deppro Pty Ltd v Hannah [2008] QSC 193 one of the matters considered by the court related to the requirement in r 243 of the Uniform Civil Procedure Rules 1999 (Qld) that a notice of non-party disclosure must “state the allegation in issue in the pleadings about which the document sought is directly relevant.” The approach adopted by the issuing party in this case, of asserting that documents sought by a notice of non-party disclosure are relevant to allegations in numbered paragraphs in pleadings and serving copies of the pleadings with the notice, is not uncommon in practice. This decision makes it clear that the practice is fraught with danger. In circumstances where it is not apparent that the non-party has been fully apprised of the relevant issues, the decision suggests an applicant for non-party disclosure who has not complied with the requirements of r 243 might be required to issue a fresh, fully compliant notice and to suffer the associated costs consequences.
Abstract:
Existing distinctions between macro and micro approaches have been jeopardising advances in Information Systems (IS) research. Both approaches have been criticised for explaining one level while neglecting the other; the current situation therefore calls for multilevel research to address these deficiencies. Instead of studying a single level (macro or micro), multilevel research entails more than one level of conceptualisation and analysis simultaneously. As the notion of multilevel is borrowed from reference disciplines, there tend to be confusions and inconsistencies within the IS discipline, which hinder the adoption of multilevel research. This paper speaks for the potential value of multilevel research by investigating its current application status within the IS domain. A content analysis of multilevel research articles from major IS conferences and journals is presented. The results suggest that IS scholars have applied multilevel research to produce high-quality work across a variety of topics. However, researchers have not yet defined “multilevel” consistently, leading to idiosyncratic meanings of multilevel research that most often reflect the authors' own interpretations. We argue that a rigorous definition of “multilevel research” needs to be explicated for consistency within the research community.
Abstract:
The purpose of this study was to determine the factors (internal and external) that influenced Canadian provincial (state) politicians when making funding decisions about public libraries. Using a case study methodology, Canadian provincial/state level funding for public libraries in the 2009-10 fiscal year was examined. After reviewing funding levels across the country, three jurisdictions were chosen for the case: British Columbia's budget revealed dramatically decreased funding, Alberta's budget showed dramatically increased funding, and Ontario's budget was unchanged from the previous year. The primary source of data for the case was a series of semi-structured interviews with elected officials and senior bureaucrats from the three jurisdictions. An examination of primary and secondary documents was also undertaken to help set the political and economic context as well as to provide triangulation for the case interviews. The data were analysed to determine whether Cialdini's theory of influence (2001), and specifically any of its six tactics of influence (i.e., commitment and consistency, authority, liking, social proof, scarcity and reciprocity), were instrumental in these budget processes. Findings show the principles of "authority", "consistency and commitment" and "liking" were relevant, and that "liking" was especially important to these decisions. When these decision makers were considering funding for public libraries, they most often used three distinct lenses: the consistency lens (what are my values? what would my party do?), the authority lens (is someone with hierarchical power telling me to do this? are the requests legitimate?), and, most importantly, the liking lens (how much do I like and know about the requester?). These findings are consistent with Cialdini's theory, which suggests the quality of some relationships is one of six factors that most influence a decision maker. The small number of prior research studies exploring the reasons for increases or decreases in public library funding allocation decisions has given little insight into the factors that motivate the politicians involved in the process and the variables that contribute to these decisions. No prior studies have examined the construct of influence in decision making about funding for Canadian public libraries at any level of government, nor within the context of Canadian provincial politics. While many public libraries face difficult decisions in the face of uncertain funding futures, the ability of the sector to obtain favourable responses to requests for increases may require a less simplistic approach than previously thought. The ability to create meaningful connections with individuals in many communities and across all levels of government should be emphasised as a key factor in influencing funding decisions.
Abstract:
China's market-oriented labor market reform has been in place for about one and a half decades. This study uses individual data for 1981 and 1987 to examine the success of the first half of the reform program. Success is evaluated by examining changes in the wage setting structure in the state-owned sector over the reform period. Have the market reforms stimulated worker incentives by increasing the returns to human capital acquisition? Has the wage structure altered to more closely mimic that of a market economy? In 1987, there is evidence of a structural change in the system of wage determination, with slightly increased rates of return to human capital. However, changes in industrial wage differentials appear to play the dominant role. It is argued that this may be due to labor market reforms, in particular the introduction of the profit-related bonus scheme. J. Comp. Econom., December 1997, 25(3), pp. 403–421. Australian National University, Canberra, ACT 0200, Australia; University of Tasmania, Hobart, Tasmania, Australia; and University of Aberdeen, Old Aberdeen, Scotland AB24 3QY.
Abstract:
In condition-based maintenance (CBM), effective diagnostic and prognostic tools are essential for maintenance engineers to identify imminent faults and predict the remaining useful life before components finally fail. This enables remedial actions to be taken in advance and production to be rescheduled if necessary. All machine components are subject to degradation processes in real environments, and they have certain failure characteristics which can be related to the operating conditions. This paper describes a technique for accurate assessment of the remnant life of bearings based on health state probability estimation and historical knowledge embedded in a closed-loop diagnostics and prognostics system. The technique uses a Support Vector Machine (SVM) classifier as a tool for estimating the health state probability of the machine degradation process to provide long-term prediction. To validate the feasibility of the proposed model, real-life fault historical data from bearings of High Pressure Liquefied Natural Gas (HP-LNG) pumps were analysed and used to obtain the optimal prediction of remaining useful life (RUL). The results obtained were very encouraging and showed that the proposed prognosis system based on health state probability estimation has the potential to be used as an estimation tool for remnant life prediction in industrial machinery.
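A minimal sketch of the classification step described above, assuming synthetic placeholder features and labels rather than the paper's HP-LNG pump data: an SVM classifier with probability outputs estimates the probability of each discrete health state for a new observation.

```python
import numpy as np
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Hypothetical 2-D condition-monitoring features (e.g. RMS and kurtosis of
# vibration signals) for three discrete health states; purely illustrative.
rng = np.random.default_rng(1)
n_per_state = 200
states = ["healthy", "degraded", "near-failure"]
X = np.vstack([rng.normal(loc=[m, m], scale=0.6, size=(n_per_state, 2))
               for m in (0.0, 1.5, 3.0)])
y = np.repeat(states, n_per_state)

# Scale features, then fit an RBF-kernel SVM with probability estimates enabled.
scaler = StandardScaler().fit(X)
clf = SVC(kernel="rbf", probability=True, random_state=0).fit(scaler.transform(X), y)

# Estimated probability of each health state for a new feature vector.
new_sample = np.array([[2.2, 2.4]])
probs = clf.predict_proba(scaler.transform(new_sample))[0]
for state, prob in zip(clf.classes_, probs):
    print(f"P({state}) = {prob:.2f}")
```

In a prognosis setting, probabilities like these would be tracked over time and fed into the remaining-useful-life estimation, as the paper outlines.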
Abstract:
Controlled drug delivery is a key topic in modern pharmacotherapy, where controlled drug delivery devices are required to prolong the period of release, maintain a constant release rate, or release the drug with a predetermined release profile. In the pharmaceutical industry, the development process of a controlled drug delivery device may be facilitated enormously by the mathematical modelling of drug release mechanisms, directly decreasing the number of necessary experiments. Such mathematical modelling is difficult because several mechanisms are involved during the drug release process. The main drug release mechanisms of a controlled release device are based on the device's physicochemical properties, and include diffusion, swelling and erosion. In this thesis, four controlled drug delivery models are investigated. These four models selectively involve the solvent penetration into the polymeric device, the swelling of the polymer, the polymer erosion and the drug diffusion out of the device, but all share two common key features. The first is that the solvent penetration into the polymer causes the transition of the polymer from a glassy state into a rubbery state. The interface between the two states of the polymer is modelled as a moving boundary and the speed of this interface is governed by a kinetic law. The second feature is that drug diffusion only happens in the rubbery region of the polymer, with a nonlinear diffusion coefficient which is dependent on the concentration of solvent. These models are analysed using both formal asymptotics and numerical computation, where front-fixing methods and the method of lines with finite difference approximations are used to solve the models numerically. This numerical scheme is conservative, accurate and easily implemented for moving boundary problems, and is thoroughly explained in Section 3.2. From the small-time asymptotic analysis in Sections 5.3.1, 6.3.1 and 7.2.1, these models exhibit the non-Fickian behaviour referred to as Case II diffusion, and an initial constant rate of drug release which is appealing to the pharmaceutical industry because it indicates zero-order release. The numerical results of the models qualitatively confirm the experimental behaviour identified in the literature. The knowledge obtained from investigating these models can help to develop more complex multi-layered drug delivery devices in order to achieve sophisticated drug release profiles. A multi-layer matrix tablet, which consists of a number of polymer layers designed to provide sustained and constant drug release or bimodal drug release, is also discussed in this research. The moving boundary problem describing the solvent penetration into the polymer also arises in melting and freezing problems, which have been modelled as the classical one-phase Stefan problem. The classical one-phase Stefan problem has unrealistic singularities at the complete melting time. Hence we investigate the effect of including kinetic undercooling in the melting problem, which we call the one-phase Stefan problem with kinetic undercooling. Interestingly, we discover that the unrealistic singularities of the classical one-phase Stefan problem at the complete melting time are regularised, and the small-time asymptotic analysis in Section 3.3 shows that the small-time behaviour of the one-phase Stefan problem with kinetic undercooling differs from that of the classical problem.
In the case of melting very small particles, it is known that surface tension effects are important. The effect of including surface tension in the melting problem for nanoparticles (without kinetic undercooling) has been investigated in the past; however, the one-phase Stefan problem with surface tension exhibits finite-time blow-up. Therefore we investigate the effect of including both surface tension and kinetic undercooling in the melting problem for nanoparticles, and find that the solution continues to exist until complete melting. The investigation of including kinetic undercooling and surface tension in the melting problems reveals more insight into the regularisation of unphysical singularities in the classical one-phase Stefan problem. This investigation gives a better understanding of melting a particle, and contributes to the current body of knowledge related to melting and freezing due to heat conduction.
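A minimal numerical sketch of the approach described in this abstract, under illustrative assumptions (classical one-phase Stefan problem in non-dimensional form, unit Stefan number, no kinetic undercooling or surface tension): the moving boundary is immobilised by a front-fixing transformation and the resulting system is advanced with the method of lines and finite differences.

```python
import numpy as np
from scipy.integrate import solve_ivp

# Non-dimensional one-phase Stefan problem (illustrative, not the thesis model):
#   u_t = u_xx on 0 < x < s(t),  u(0,t) = 1,  u(s(t),t) = 0,
#   beta * ds/dt = -u_x(s(t),t)                  (Stefan condition)
# Front-fixing transform xi = x / s(t) gives
#   u_t = u_xixi / s^2 + xi * (ds/dt) * u_xi / s  on 0 < xi < 1.
N = 100                          # interior grid points in xi
xi = np.linspace(0.0, 1.0, N + 2)
h = xi[1] - xi[0]
beta = 1.0                       # Stefan number (illustrative)

def rhs(t, y):
    s = y[-1]
    u = np.empty(N + 2)
    u[0], u[-1] = 1.0, 0.0                       # fixed boundary values
    u[1:-1] = y[:-1]
    u_xi_front = (u[-1] - u[-2]) / h             # one-sided difference at the front
    sdot = -u_xi_front / (beta * s)              # transformed Stefan condition
    u_xixi = (u[2:] - 2.0 * u[1:-1] + u[:-2]) / h**2
    u_xi = (u[2:] - u[:-2]) / (2.0 * h)
    dudt = u_xixi / s**2 + xi[1:-1] * sdot * u_xi / s
    return np.append(dudt, sdot)

s0 = 0.1                                         # small initial front position
u0 = 1.0 - xi[1:-1]                              # linear initial profile
sol = solve_ivp(rhs, (0.0, 1.0), np.append(u0, s0), method="BDF", rtol=1e-6)
print("front position s(t) at final time:", sol.y[-1, -1])
```

The same front-fixing scaffold extends naturally to the kinetic-undercooling and surface-tension variants discussed in the thesis by modifying the front condition and the expression for ds/dt.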
Abstract:
Restoring old buildings to conform to current building policies and standards is a great challenge for engineers and architects. The restoration of Brisbane City Hall, a heritage building listed by the State of Queensland in Australia, developed an innovative approach to upgrading the building using a method called ‘concrete overlay’, following the guidelines of both the International Council on Monuments and Sites and the Burra Charter of Australia. Concrete overlay is a new method of structural strengthening in which new reinforcement is drilled in and new concrete is placed on top of the existing structure, akin to a bone transplant or bone grafting in a human being. The method is popularly used for newer bridges which have suffered load stresses; however, it had never been used on heritage buildings, which were built under different conditions and standards. The compatibility of this method is currently being monitored. Most modern historic buildings are rapidly deteriorating and require immediate intervention in order to be saved. As most of these heritage buildings are at a stage of advanced deterioration, significant attempts are being made and several innovations are being applied to upgrade these structures to conform to current building requirements. To date, the knowledge and literature regarding ‘concrete cancer’ in relation to rehabilitating these reinforced concrete heritage structures is significantly lacking. It is hoped that the method of concrete overlay and the case study of the Brisbane City Hall restoration will contribute to the development of restoration techniques and policies for Modern Heritage Buildings.
Abstract:
An optimisation study of the fabrication of a compact TiO2 blocking layer (via Spray Pyrolysis Deposition) for poly(3-hexylthiophene) (P3HT) based Solid State Dye Sensitized Solar Cells (SDSCs) is reported. We used a novel spray TiO2 precursor solution composition obtained by adding acetylacetone to a conventional formulation (diisopropoxytitanium bis(acetylacetonate) in ethanol). Scanning Electron Microscopy shows a TiO2 layer with compact morphology and a thickness of around 100 nm. Tafel plot analysis shows an enhancement of the device's diode-like behaviour induced by the acetylacetone blocking layer with respect to the conventional one. Significantly, the device fabricated with the acetylacetone blocking layer shows an overall increase in cell performance with respect to the cell with the conventional one (ΔJsc/Jsc = +13.8%, ΔFF/FF = +39.7%, ΔPCE/PCE = +55.6%). An optimum conversion efficiency is found for 15 successive spray cycles, where the diode-like behaviour of the acetylacetone blocking layer is more effective. Over three batches of cells (fabricated with P3HT and dye D35), an average conversion efficiency of 3.9% (under a class A sun simulator with 1 sun A.M. 1.5 illumination conditions) was measured. From the best cell we fabricated, a conversion efficiency of 4.5% was extracted. This represents a significant increase with respect to previously reported values for P3HT/dye D35 based SDSCs.
Abstract:
This paper examines patterns of political activity and campaigning on Twitter in the context of the 2012 election in the Australian state of Queensland. Social media have been a visible component of political campaigning in Australia at least since the 2007 federal election, with Twitter, in particular, rising to greater prominence in the 2010 federal election. At state level, however, they have remained comparatively less important thus far. In this paper, we track uses of Twitter in the Queensland campaign from its unofficial start in February through to the election day of 24 March 2012. We both examine the overall patterns of activity in the hashtag #qldvotes and track specific interactions between politicians and other users by following some 80 Twitter accounts of sitting members of parliament and alternative candidates. This analysis provides new insights into the different approaches to social media campaigning embraced by specific candidates and party organisations, as well as an indication of the relative importance of social media activities, at present, for state-level election campaigns.
Abstract:
Elliptic curve pairings are undoubtedly the most powerful known primitive in public-key cryptography. Upon their introduction just over ten years ago, the computation of pairings was far too slow for them to be considered a practical option. This prompted a vast amount of research by mathematicians and computer scientists around the globe aiming to improve the computation speed. From the use of modern results in algebraic and arithmetic geometry to the application of foundational number theory that dates back to the days of Gauss and Euler, cryptographic pairings have since experienced a great deal of improvement. As a result, what was once an extremely expensive computation taking several minutes is now a high-speed operation that takes less than a millisecond. This thesis presents a range of optimisations to the state of the art in cryptographic pairing computation. Both by extending prior techniques and by introducing several novel ideas of our own, our work has contributed to record-breaking pairing implementations.
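The thesis itself targets low-level pairing arithmetic, but the defining bilinearity property that makes pairings so powerful can be checked with an off-the-shelf library. The sketch below assumes the py_ecc package and its bn128 interface (G1, G2, multiply, pairing, curve_order); adjust the imports if your version differs, and note that the scalar values are arbitrary.

```python
# Sketch under the assumption that py_ecc's bn128 module is available;
# it verifies e(bQ, aP) == e(Q, P)^(ab), the bilinearity property of pairings.
from py_ecc.bn128 import G1, G2, multiply, pairing, curve_order

a, b = 6, 11                                        # arbitrary small scalars
lhs = pairing(multiply(G2, b), multiply(G1, a))     # e(bQ, aP)
rhs = pairing(G2, G1) ** ((a * b) % curve_order)    # e(Q, P)^(ab)
print("bilinearity holds:", lhs == rhs)
```

A pure-Python library like this is orders of magnitude slower than the optimised implementations the thesis describes, which is precisely the gap that work on pairing computation aims to close.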
Abstract:
We report inelastic neutron scattering measurements of the neutron Compton profile, J(y), for Be and for D in polycrystalline ZrD2 over a range of momentum transfers q between 27 and 178 Å⁻¹. The measurements were performed using the inverse geometry spectrometer eVS, which is situated at the UK pulsed spallation neutron source ISIS. We have investigated deviations from impulse approximation (IA) scattering, which are generically referred to as final state effects (FSEs), using a method described by Sears. This method allows both the magnitude and the q dependence of the FSEs to be studied. Analysis of the measured data was compared with analysis of numerical simulations based on the harmonic approximation, and good agreement was found for both ZrD2 and Be. Finally, we have shown how ⟨∇²V⟩, where V is the interatomic potential, can be extracted from the antisymmetric component of J(y).
Abstract:
In this paper we analyse the effects of highway traffic flow parameters, such as vehicle arrival rate and density, on the performance of Amplify-and-Forward (AF) cooperative vehicular networks along a multi-lane highway under the free-flow state. We derive analytical expressions for connectivity performance and verify them with Monte Carlo simulations. When AF cooperative relaying is employed together with Maximum Ratio Combining (MRC) at the receivers, the average route error rate shows a 10- to 20-fold improvement compared with direct communication. A 4- to 8-fold increase in the maximum number of traversable hops can also be observed at different vehicle densities when AF cooperative communication is used to strengthen communication routes. However, the theoretical upper bound on the maximum number of hops promises higher performance gains.
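A Monte Carlo sketch of the comparison made in the paper, under illustrative assumptions (BPSK, a single relay, independent Rayleigh-faded links with equal average SNR) rather than the paper's highway traffic model: the end-to-end SNR of the amplify-and-forward path is combined with the direct link by MRC and the resulting bit error rate is compared with direct transmission.

```python
import numpy as np
from scipy.special import erfc

# Illustrative parameters, not taken from the paper.
rng = np.random.default_rng(2)
trials = 200_000
snr_db = 10.0
gamma_bar = 10.0 ** (snr_db / 10.0)           # average SNR per link

def q_func(x):
    """Gaussian Q-function; BPSK conditional BER is Q(sqrt(2*SNR))."""
    return 0.5 * erfc(x / np.sqrt(2.0))

# Instantaneous SNRs of the three Rayleigh-faded links (exponential RVs).
g_sd = rng.exponential(gamma_bar, trials)     # source -> destination
g_sr = rng.exponential(gamma_bar, trials)     # source -> relay
g_rd = rng.exponential(gamma_bar, trials)     # relay -> destination

# Standard end-to-end SNR of the variable-gain AF relayed path.
g_af = g_sr * g_rd / (g_sr + g_rd + 1.0)

ber_direct = np.mean(q_func(np.sqrt(2.0 * g_sd)))
ber_mrc = np.mean(q_func(np.sqrt(2.0 * (g_sd + g_af))))   # MRC adds SNRs

print(f"BER direct        : {ber_direct:.4e}")
print(f"BER AF relay + MRC: {ber_mrc:.4e}")
```

Extending the averaging over the spatial distribution of vehicles (arrival rate, density, lane geometry) is what turns a per-link comparison like this into the route-level connectivity results reported in the paper.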