937 results for Information Seeking.
Abstract:
Two sources of uncertainty in the X-ray computed tomography imaging of polymer gel dosimeters are investigated in this paper. The first cause is a change in post-irradiation density, which is proportional to the computed tomography signal and is associated with a volume change. The second cause of uncertainty is reconstruction noise. A simple technique that increases the residual signal-to-noise ratio by almost two orders of magnitude is examined.
Abstract:
IT resources are indispensable in the management of Public Sector Organizations (PSOs) around the world. We investigate the factors that could leverage the IT resources in PSOs in developing economies. While research on ways to leverage IT resources in private sector organizations of developed countries is substantial, our understanding of ways to leverage IT resources in the public sector in developing countries is limited. The current study aspires to address this gap in the literature by seeking to determine the key factors required to create process value from public sector IT investments in developing countries. We draw on resource-centric theories to infer the nature of factors that could leverage the IT resources in the public sector. Employing an interpretive design, we identified three factors necessary for IT process value generation in the public sector. We discuss these factors and state their implications for theory and practice.
Abstract:
The Business Process Management domain has evolved at a dramatic pace over the past two decades and the notion of the business process has become a ubiquitous part of the modern business enterprise. Most organizations now view their operations in terms of business processes and manage these business processes in the same way as other corporate assets. In recent years, an increasingly broad range of generic technology has become available for automating business processes. This is part of a growing trend in the software engineering field throughout the past 40 years, where aspects of functionality that are potentially reusable on a widespread basis have coalesced into generic software components. Figure 2.1 illustrates this trend and shows how software systems have evolved from the monolithic applications of the 1960s, developed in their entirety and often by a single development team, to today's offerings that are based on the integration of a range of generic technologies, with only a small component of the application actually being developed from scratch. In the 1990s, generic functionality for the automation of business processes first became commercially available in the form of workflow technology and subsequently evolved into the broader field of business process management systems (BPMS). This technology alleviated the need to develop process support within applications from scratch and provided a variety of off-the-shelf options on which these requirements could be based. The demand for this technology was significant and it is estimated that by 2000 there were well over 200 distinct workflow offerings in the market, each with a distinct conceptual foundation.
Anticipating the difficulties that would be experienced by organizations seeking to utilize and integrate distinct workflow offerings, the Workflow Management Coalition (WfMC), an industry group formed to advance technology in this area, proposed a standard reference model for workflow technology with an express desire to seek a common platform for achieving workflow interoperation.
Abstract:
This study explores people's risk taking behaviour after having suffered large real-world losses following a natural disaster. Using the margins of the 2011 Australian floods (Brisbane) as a natural experimental setting, we find that homeowners who were victims of the floods and faced large losses in property values are 50% more likely to opt for a risky gamble -- a scratch card giving a small chance of a large gain ($500,000) -- than for a sure amount of comparable value ($10). This finding is consistent with prospect theory predictions regarding the adoption of a risk-seeking attitude after a loss.
Abstract:
Moving cell fronts are an essential feature of wound healing, development and disease. The rate at which a cell front moves is driven, in part, by the cell motility, quantified in terms of the cell diffusivity $D$, and the cell proliferation rate $\lambda$. Scratch assays are a commonly reported procedure used to investigate the motion of cell fronts, where an initial cell monolayer is scratched and the motion of the front is monitored over a short period of time, often less than 24 hours. The simplest way of quantifying a scratch assay is to monitor the progression of the leading edge. Leading edge data is very convenient since, unlike other methods, it is nondestructive and does not require labeling, tracking or counting individual cells amongst the population. In this work we study short time leading edge data in a scratch assay using a discrete mathematical model and automated image analysis, with the aim of investigating whether such data allows us to reliably identify $D$ and $\lambda$. Using a naïve calibration approach where we simply scan the relevant region of the $(D, \lambda)$ parameter space, we show that there are many choices of $D$ and $\lambda$ for which our model produces indistinguishable short time leading edge data. Therefore, without due care, it is impossible to estimate $D$ and $\lambda$ from this kind of data. To address this, we present a modified approach accounting for the fact that cell motility occurs over a much shorter time scale than proliferation. Using this information we divide the duration of the experiment into two periods: we estimate $D$ using data from the first period, and we estimate $\lambda$ using data from the second period.
We confirm the accuracy of our approach using in silico data and a new set of in vitro data, which shows that our method recovers estimates of $D$ and $\lambda$ that are consistent with previously reported values, while being fast, inexpensive, nondestructive and avoiding the need for cell labeling and cell counting.
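The two-period idea in the abstract above can be illustrated with a minimal, hypothetical sketch. This is not the authors' discrete model or data: the functional forms (diffusive early-time edge motion, exponential-phase logistic growth later), the parameter values and the units are all invented for illustration, under the stated assumption that motility dominates the first period and proliferation the second.

```python
import numpy as np

rng = np.random.default_rng(0)
D_true, lam_true = 500.0, 0.05   # hypothetical values, e.g. um^2/h and 1/h

# Period 1: short-time leading-edge positions, assumed motility-dominated,
# so the edge advances diffusively: x(t) ~ 2*sqrt(D*t) plus measurement noise.
t1 = np.linspace(1.0, 12.0, 12)
x = 2.0 * np.sqrt(D_true * t1) + rng.normal(0.0, 2.0, t1.size)

# Least-squares fit of x = a*sqrt(t) gives a = sum(x*sqrt(t))/sum(t),
# and D = (a/2)^2.
a = np.sum(x * np.sqrt(t1)) / np.sum(t1)
D_hat = (a / 2.0) ** 2

# Period 2: later-time cell densities, assumed proliferation-dominated,
# in the exponential regime of logistic growth: n(t) ~ n0*exp(lam*t).
t2 = np.linspace(24.0, 48.0, 12)
n = 0.1 * np.exp(lam_true * t2) * rng.lognormal(0.0, 0.02, t2.size)

# lam is the slope of log n(t) under this assumption.
lam_hat = np.polyfit(t2, np.log(n), 1)[0]

print(D_hat, lam_hat)
```

Because each parameter is fit against the period in which it dominates, neither estimate is confounded by the other, which is the essence of the identifiability fix the abstract describes.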
Abstract:
We address the problem of finite horizon optimal control of discrete-time linear systems with input constraints and uncertainty. The uncertainty for the problem analysed is related to incomplete state information (output feedback) and stochastic disturbances. We analyse the complexities associated with finding optimal solutions. We also consider two suboptimal strategies that could be employed for larger optimization horizons.
Abstract:
In this paper, we explore how BIM functionalities together with novel management concepts and methods have been utilized in thirteen hospital projects in the United States and the United Kingdom. Secondary data collection and analysis were used as the method. Initial findings indicate that the utilization of BIM enables a holistic view of project delivery and helps to integrate project parties into a collaborative process. The initiative to implement BIM must come from the top down to enable early involvement of all key stakeholders. It seems that resistance from people to adapting to the new way of working and thinking, rather than immaturity of the technology, hinders the utilization of BIM.
Abstract:
Lean construction and building information modeling (BIM) are quite different initiatives, but both are having profound impacts on the construction industry. A rigorous analysis of the myriad specific interactions between them indicates that a synergy exists which, if properly understood in theoretical terms, can be exploited to improve construction processes beyond the degree to which they might be improved by applying either of these paradigms independently. Using a matrix that juxtaposes BIM functionalities with prescriptive lean construction principles, 56 interactions have been identified, all but four of which represent constructive interaction. Although evidence for the majority of these has been found, the matrix is not considered complete but rather a framework for research to explore the degree of validity of the interactions. Construction executives, managers, designers, and developers of information technology systems for construction can also benefit from the framework as an aid to recognizing the potential synergies when planning their lean and BIM adoption strategies.
Abstract:
This paper investigates the mutual relations of three current drivers of construction: lean construction, building information modelling and sustainability. These drivers are based on infrequently occurring changes, only incidentally simultaneous, in their respective domains. It is contended that the drivers are mutually supportive and thus synergistic. They are aligned in the sense that all require, promote or enable collaboration. It is argued that these three drivers should be implemented in a unified manner for rapid and robust improvements in construction industry performance and the quality of the constructed facilities and their benefits for stakeholders and wider society.
Abstract:
Building with Building Information Modelling (BIM) changes design and production processes. But can BIM be used to support process changes designed according to lean production and lean construction principles? To begin to answer this question we provide a conceptual analysis of the interaction of lean construction and BIM for improving construction. This was investigated by compiling a detailed listing of lean construction principles and BIM functionalities which are relevant from this perspective. These were drawn from a detailed literature survey. A research framework for analysis of the interaction between lean and BIM was then compiled. The goal of the framework is to both guide and stimulate research; as such, the approach adopted up to this point is constructive. Ongoing research has identified 55 such interactions, the majority of which show positive synergy between the two.
Abstract:
This thesis investigated the information literacy experiences of EFL (English as a Foreign Language) students in a higher education institution in the United Arab Emirates (UAE). Phenomenography was used to investigate how EFL students 'used information to learn' (i.e., information literacy). The study revealed that EFL students experienced information literacy across four categories and had varying experiences of information and learning. The research also showed that EFL students faced a number of challenges and barriers due to language that impacted their experiences of reading, understanding, accessing and translating information.
Abstract:
This study explored early career academics' experiences in using information to learn while building their networks for professional development. A 'knowledge ecosystem' model was developed consisting of informal learning interactions such as relating to information to create knowledge and engaging in mutually supportive relationships. Findings from this study present an alternative interpretation of information use for learning that is focused on processes manifesting as human interactions with informing entities revolving around the contexts of reciprocal human relationships.
Abstract:
This research has analysed both reciprocity and feedback mechanisms in multi-antenna wireless systems. It has presented the basis of an effective CSI feedback mechanism that efficiently provides the transmitter with the minimum information needed to maintain accurate knowledge of a rapidly changing channel. Simulations were conducted using MATLAB to measure the improvement when the channel is estimated at the receiver in a 2 × 2 multi-antenna system, compared to the case of perfect channel knowledge at the receiver.
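As a hypothetical companion to the simulation setup described above (the thesis used MATLAB; this is not its code), the sketch below measures the error of a standard least-squares pilot-based estimate of a 2 × 2 Rayleigh channel, the kind of baseline against which perfect receiver-side knowledge is compared. The pilot SNR and trial count are invented for illustration.

```python
import numpy as np

rng = np.random.default_rng(1)
snr = 10.0       # pilot SNR (linear), a hypothetical value
trials = 2000
err = 0.0
for _ in range(trials):
    # 2x2 Rayleigh-fading channel, unit average power per entry.
    H = (rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))) / np.sqrt(2)
    X = np.sqrt(snr) * np.eye(2)            # orthogonal pilot block
    N = (rng.normal(size=(2, 2)) + 1j * rng.normal(size=(2, 2))) / np.sqrt(2)
    Y = H @ X + N                           # received pilot observations
    H_hat = Y @ np.linalg.inv(X)            # least-squares channel estimate
    err += np.mean(np.abs(H - H_hat) ** 2)
mse = err / trials
print(mse)   # per-entry MSE ~ 1/snr = 0.1 for this pilot design
```

The gap between `H_hat` and the true `H` is what a feedback mechanism must convey efficiently enough for the transmitter to track a rapidly changing channel.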
Abstract:
We explored whether teams develop shared perceptions regarding the quantity and quality of information and the extent of participation in decision making provided in an environment of continuous change. In addition, we examined whether change climate strength moderated relationships between change climate level and team outcomes. We examined relationships among aggregated change information and change participation and aggregated team outcomes, including two role stressors (i.e., role ambiguity and role overload) and two indicators of well-being (i.e., quality of worklife and distress). Questionnaires were distributed in an Australian law enforcement agency and data were used from 178 teams. Structural equation modelling analyses, controlling for a marker variable, were conducted to examine the main effects of aggregated change information and aggregated change participation on aggregated team outcomes. Results provided support for a model that included method effects due to a marker variable. In this model, change information climate was significantly negatively associated with role ambiguity, role overload, and distress, and significantly positively associated with quality of worklife. Change participation climate was significantly positively associated with quality of worklife. Change climate strength did not moderate relationships among change climate level and team outcomes.