Abstract:
The Business and Information Technologies (BIT) project strives to reveal new insights into how modern IT impacts organizational structures and business practices, using empirical methods. Due to its international scope, it allows for inter-country comparison of empirical results. Germany, represented by the European School of Management and Technology (ESMT) and the Institute of Information Systems at Humboldt-Universität zu Berlin, joined the BIT project in 2006. This report presents the results of the first survey conducted in Germany during November–December 2006. The key results are as follows:
• The most widely adopted technologies and systems in Germany are websites, wireless hardware and software, groupware/productivity tools, and enterprise resource planning (ERP) systems. The biggest potential for growth exists for collaboration and portal tools, content management systems, business process modelling, and business intelligence applications. A number of technological solutions have not yet been adopted by many organizations but also bear some potential, in particular identity management solutions, Radio Frequency Identification (RFID), biometrics, and third-party authentication and verification.
• IT security remains at the top of the agenda for most enterprises: security budgets have been increasing over the last three years.
• The workplace and work requirements are changing. IT is used to monitor employees' performance in Germany, but less heavily than in the United States (Karmarkar and Mangal, 2007). The demand for IT skills is increasing at all corporate levels. Executives are asking for more and better structured information, which in turn triggers the appearance of new decision-making tools and online technologies on the market.
• The internal reorganization of companies in Germany is underway: organizations are becoming flatter, even though the trend is not as pronounced as in the United States (Karmarkar and Mangal, 2007), and the geographical scope of their operations is increasing. Modern IT plays an important role in enabling this development; for example, telecommuting, teleconferencing, and other web-based collaboration formats are becoming increasingly popular in the corporate context.
• The degree to which outsourcing is being pursued is quite limited, with little change expected. IT services, payroll, and market research are the most widely outsourced business functions. This corresponds to the results from other countries.
• Up to now, the adoption of e-business technologies has had a rather limited effect on marketing functions. Companies tend to extract synergies from combining traditional printed media and online advertising.
• The adoption of e-business has not had a major impact on marketing capabilities and strategy yet. Traditional methods of customer segmentation still dominate, and the corporate identity of most organizations does not change significantly when going online.
• Online sales channels are mainly viewed as a complement to traditional distribution means.
• Technology adoption has caused production and organizational costs to decrease. However, the costs of technology acquisition and maintenance, as well as consultancy and internal communication costs, have increased.
Abstract:
Using online social networks to connect and interact with people has become extremely popular all around the world. The largest Social Networking Site (SNS), Facebook, offers its services in over 70 languages and increasingly relies on international users to grow its membership. Aiming to understand the role of culture in SNS participation, this study adopts a 'privacy calculus' perspective to examine the differences in participation patterns between American and Moroccan Facebook users. Survey results show that Moroccan users disclose less on Facebook than US users, yet perceive more damage should their privacy on Facebook be violated. American users, on the other hand, have lower privacy concerns, trust fellow SNS members and the legal system more, and disclose more in their profiles. From a practical standpoint, the results indicate that SNS providers cannot rely on the same methods to encourage user participation and disclosure in different countries.
Abstract:
Little is known about the course of recovery of acute low back pain (LBP) patients as a function of depression. In a prospective study, 286 acute LBP patients were assessed at baseline and followed up over 6 months. Recovery was defined as improvement in the Oswestry Disability Index (ODI). Repeated-measures analysis of covariance was employed with the ODI as the repeated factor; age, sex, and body mass index as covariates; and depression and all other potential prognostic factors as between-subject factors. Of the study participants, 18% were classified as depressive (>33 points on the Zung Self-Rating Depression Scale). Of the 286 participants, 135 were lost to follow-up. In the longitudinal sample of 151 patients, the course of recovery was slower in depressive patients. Depression was associated with LBP especially after 6 weeks and should therefore be included in screening instruments for acute LBP patients in order to identify those at risk of delayed recovery at an early stage.
Abstract:
OBJECTIVE: Occupational low back pain (LBP) is considered to be the most expensive form of work disability, with the socioeconomic costs of persistent LBP far exceeding the costs of acute and subacute LBP. This makes the early identification of patients at risk of developing persistent LBP essential, especially in working populations. The aim of the study was to evaluate both risk factors (for the development of persistent LBP) and protective factors (preventing the development of persistent LBP) in the same cohort. PARTICIPANTS: An inception cohort of 315 patients with acute, subacute, or recurrent LBP was recruited from 14 health practitioners (twelve general practitioners and two physiotherapists) across New Zealand. METHODS: Patients with persistent LBP at six-month follow-up were compared to patients with non-persistent LBP with respect to occupational, psychological, biomedical and demographic/lifestyle predictors at baseline, using multiple logistic regression analyses. All significant variables from the different domains were combined into a single predictor model. RESULTS: A final two-predictor model with an overall predictive value of 78% included social support at work (OR 0.67; 95% CI 0.45 to 0.99) and somatization (OR 1.08; 95% CI 1.01 to 1.15). CONCLUSIONS: Social support at work should be considered a resource preventing the development of persistent LBP, whereas somatization should be considered a risk factor for the development of persistent LBP. Further studies are needed to determine whether addressing these factors in workplace interventions for patients suffering from acute, subacute or recurrent LBP prevents the subsequent development of persistent LBP.
Abstract:
Opportunistic routing (OR) employs a list of candidates to improve wireless transmission reliability. However, conventional list-based OR restricts the freedom of opportunism, since only the listed nodes are allowed to compete for packet forwarding. Additionally, the list is generated statically, based on a single network metric, prior to data transmission, which is not appropriate for mobile ad-hoc networks (MANETs). In this paper, we propose a novel OR protocol for MANETs: Context-aware Adaptive Opportunistic Routing (CAOR). CAOR abandons the idea of a candidate list and allows all qualified nodes to participate in packet transmission. CAOR forwards packets by simultaneously exploiting multiple pieces of cross-layer context information, such as link quality, geographic progress, energy, and mobility. With the help of Analytic Hierarchy Process theory, CAOR adjusts the weights of the context information based on their instantaneous values to adapt the protocol behavior at run-time. Moreover, CAOR uses an active suppression mechanism to reduce packet duplication. Simulation results show that CAOR can provide efficient routing in highly mobile environments. The adaptivity of CAOR is also validated.
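As a rough illustration of the weighting idea described in this abstract, the following Python sketch derives criteria weights with the standard AHP principal-eigenvector method and scores forwarding candidates as a weighted sum of normalized context metrics. The metric names, the pairwise-comparison matrix, and the scoring rule are illustrative assumptions, not CAOR's exact formulation.

```python
import numpy as np

def ahp_weights(pairwise):
    """Derive criteria weights from a pairwise-comparison matrix
    (principal-eigenvector method, as in standard AHP)."""
    vals, vecs = np.linalg.eig(np.asarray(pairwise, dtype=float))
    w = np.real(vecs[:, np.argmax(np.real(vals))])
    return w / w.sum()

def forwarding_score(metrics, weights):
    """Weighted sum of normalized context metrics (higher is better)."""
    return float(np.dot(weights, metrics))

# Hypothetical criteria: link quality, geographic progress, residual energy, low mobility.
# The comparison matrix below is invented for illustration.
pairwise = [[1, 2, 3, 3],
            [1/2, 1, 2, 2],
            [1/3, 1/2, 1, 1],
            [1/3, 1/2, 1, 1]]
w = ahp_weights(pairwise)

# Each candidate advertises its normalized metrics in [0, 1].
candidates = {"nodeA": [0.9, 0.4, 0.7, 0.8], "nodeB": [0.6, 0.8, 0.5, 0.6]}
best = max(candidates, key=lambda n: forwarding_score(candidates[n], w))
print(best, {n: round(forwarding_score(m, w), 3) for n, m in candidates.items()})
```

In a deployment, the comparison matrix would be updated from the instantaneous metric values rather than fixed, which is what allows the weighting to adapt at run-time.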
Abstract:
Background: Patients presenting to the emergency department (ED) currently face unacceptable delays in initial treatment and long, costly hospital stays due to suboptimal initial triage and site-of-care decisions. Accurate ED triage should focus not only on initial treatment priority, but also on prediction of medical risk and nursing needs, in order to improve site-of-care decisions and to simplify early discharge management. Different triage scores have been proposed, such as the Manchester Triage System (MTS). Yet these scores focus only on treatment priority, have suboptimal performance and lack validation in the Swiss health care system. Because the MTS will be introduced into clinical routine at the Kantonsspital Aarau, we propose a large prospective cohort study to optimize initial patient triage. Specifically, the aim of this trial is to derive a three-part triage algorithm to better predict (a) treatment priority; (b) medical risk and thus need for in-hospital treatment; and (c) post-acute care needs of patients at the most proximal time point of ED admission. Methods/design: Prospective, observational, multicenter, multi-national cohort study. We will include all consecutive medical patients seeking ED care in this observational registry. There will be no exclusions except for non-adult and non-medical patients. Vital signs will be recorded and left-over blood samples will be stored for later batch analysis of blood markers. Upon ED admission, the post-acute care discharge score (PACD) will be recorded. Attending ED physicians will adjudicate triage priority based on all available results at the time of ED discharge to the medical ward. Patients will be reassessed daily during the hospital course for medical stability and readiness for discharge from the nurses' and, if involved, social workers' perspective. To assess outcomes, data from electronic medical records will be used and all patients will be contacted 30 days after hospital admission to assess vital and functional status, re-hospitalization, satisfaction with care and quality-of-life measures. We aim to include between 5000 and 7000 patients over one year of recruitment to derive the three-part triage algorithm. The respective main endpoints are defined as (a) initial triage priority (high vs. low priority) adjudicated by the attending ED physician at ED discharge; (b) adverse 30-day outcome (death or intensive care unit admission) within 30 days following ED admission, to assess patients' risk and thus the need for in-hospital treatment; and (c) post-acute care needs after hospital discharge, defined as transfer of the patient to a post-acute care institution, for early recognition and planning of post-acute care needs. Other outcomes are time to first physician contact, time to initiation of adequate medical therapy, time to social worker involvement, length of hospital stay, reasons for discharge delays, patients' satisfaction with care, overall hospital costs and patients' care needs after returning home. Discussion: Using a reliable initial triage system to estimate initial treatment priority, need for in-hospital treatment and post-acute care needs is an innovative and persuasive approach for a more targeted and efficient management of medical patients in the ED. The proposed interdisciplinary, multi-national project has unprecedented potential to improve initial triage decisions and optimize resource allocation to the sickest patients from admission to discharge.
The algorithms derived in this study will be compared, in a later randomized controlled trial, against a usual-care control group in terms of resource use, length of hospital stay, overall costs and patient outcomes such as mortality, re-hospitalization, quality of life and satisfaction with care.
Abstract:
In this work, we propose a distributed rate allocation algorithm that minimizes the average decoding delay for multimedia clients in inter-session network coding systems. We consider a scenario where the users are organized in a mesh network and each user requests the content of one of the available sources. We propose a novel distributed algorithm in which network users determine the coding operations and the packet rates to be requested from the parent nodes such that the decoding delay is minimized for all clients. A rate allocation problem is solved by every user, which seeks the rates that minimize the average decoding delay for its children and for itself. Since this optimization problem is a priori non-convex, we introduce the concept of equivalent packet flows, which makes it possible to estimate the expected number of packets that every user needs to collect for decoding. We then decompose our original rate allocation problem into a set of convex subproblems, which are eventually combined to obtain an effective approximate solution to the delay minimization problem. The results demonstrate that the proposed scheme eliminates bottlenecks and reduces the decoding delay experienced by users with limited bandwidth resources. We validate the performance of our distributed rate allocation algorithm in different video streaming scenarios using the NS-3 network simulator. We show that our system is able to take advantage of inter-session network coding for the simultaneous delivery of video sessions in networks with path diversity.
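To make the flavour of the decomposition concrete, here is a minimal Python sketch of the kind of convex per-node subproblem suggested above: a parent splits its upload capacity among its children so that the average estimated decoding delay (packets still needed divided by allocated rate) is minimized. The numbers, the single capacity constraint and the exact objective are illustrative assumptions rather than the paper's formulation.

```python
import numpy as np
from scipy.optimize import minimize

# Toy per-node subproblem: a parent with total upload capacity C splits packet
# rates among its children; child u needs roughly N[u] (equivalent) packets to
# decode, so its expected decoding delay is ~ N[u] / r[u].
N = np.array([30.0, 50.0, 20.0])   # estimated packets needed per child (assumed)
C = 10.0                           # parent capacity in packets per second (assumed)

def avg_delay(r):
    return float(np.mean(N / r))

cons = ({"type": "ineq", "fun": lambda r: C - r.sum()},)   # sum(r) <= C
bnds = [(1e-3, C)] * len(N)
res = minimize(avg_delay, x0=np.full(len(N), C / len(N)),
               bounds=bnds, constraints=cons)
print(np.round(res.x, 2))   # more rate is allocated to children needing more packets
```

The objective is convex over positive rates, which is why such subproblems can be solved locally and their solutions combined into an approximate global allocation.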
Abstract:
Content Distribution Networks (CDNs) are mandatory components of modern web architectures, with plenty of vendors offering their services. Despite the maturity of the field, new paradigms and architecture models are still being developed in this area. Cloud computing, on the other hand, is a more recent concept which has expanded extremely quickly, with new services being regularly added to cloud management software suites such as OpenStack. The main contribution of this paper is the architecture and the development of an open-source CDN that can be provisioned in an on-demand, pay-as-you-go model, thereby enabling the CDN-as-a-Service (CDNaaS) paradigm. We describe our experience with the integration of the CDNaaS framework in a cloud environment, as a service for enterprise users. We emphasize the flexibility and elasticity of such a model, with each CDN instance being delivered on demand and associated with personalized caching policies as well as an optimized choice of Points of Presence based on the exact requirements of an enterprise customer. Our development is based on the framework developed in the Mobile Cloud Networking (MCN) EU FP7 project, which offers its enterprise users a common framework to instantiate and control services. CDNaaS is one of the core support components in this project, as it is tasked to deliver different types of multimedia content to several thousand geographically distributed users. It integrates seamlessly into the MCN service life-cycle and as such enjoys all the benefits of a common design environment, allowing for improved interoperability with the rest of the services within the MCN ecosystem.
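The on-demand provisioning model could look roughly like the following Python sketch, in which an enterprise tenant requests a CDN instance with its own caching policy and preferred Points of Presence. The endpoint URL and payload fields are purely hypothetical and do not reflect the actual MCN CDNaaS API.

```python
import requests

# Hypothetical REST call illustrating the on-demand "CDN as a Service" idea.
# Endpoint and payload fields are invented for illustration only.
CDNAAS_ENDPOINT = "https://cdnaas.example.org/api/instances"   # hypothetical

payload = {
    "tenant": "acme-media",
    "caching_policy": {"ttl_seconds": 3600, "max_object_mb": 512},
    "points_of_presence": ["eu-central", "eu-west", "us-east"],
    "billing": "pay-as-you-go",
}

resp = requests.post(CDNAAS_ENDPOINT, json=payload, timeout=10)
resp.raise_for_status()
print("CDN instance provisioned:", resp.json().get("instance_id"))
```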
Abstract:
We review our recent work on protein-ligand interactions in vitamin transporters of the Sec14-like protein family. Our studies focused on the cellular retinaldehyde-binding protein (CRALBP) and the alpha-tocopherol transfer protein (alpha-TTP). CRALBP is responsible for the mobilisation and photo-protection of short-chain cis-retinoids in the dim-light visual cycle of rod photoreceptors. alpha-TTP is a key protein responsible for the selection and retention of RRR-alpha-tocopherol, the most active isoform of vitamin E in higher animals. Our simulation studies show how subtle chemical variations in the substrate can lead to significant distortions in the structure of the complex, and how these changes can either lead to new protein function or be used to model engineered protein variants with tailored properties. Finally, we show how the integration of computational and experimental results can contribute, in synergy, to the understanding of fundamental processes at the biomolecular scale.
Relative predicativity and dependent recursion in second-order set theory and higher-order theories
Abstract:
This article reports that some of the robustness of the notions of predicativity and of autonomous progression breaks down if, as the given infinite totality, we choose mathematical entities other than the traditional ω. Namely, the equivalence between the normal transfinite recursion scheme and the new dependent transfinite recursion scheme, which does hold in the context of subsystems of second-order number theory, does not hold in the context of subsystems of second-order set theory where the universe V of sets is treated as the given totality (nor in the contexts of n+3-th order number or set theories, where the class of all n+2-th order objects is treated as the given totality).
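For orientation, the normal transfinite recursion scheme referred to above can be stated in the second-order number-theoretic setting roughly as in the standard formulation below; the dependent variant studied in the article generalizes this scheme and is not reproduced here.

```latex
% Normal transfinite recursion along a well-ordering \prec, for a formula \theta:
% a hierarchy H exists whose stage a is defined from the stages below a.
\[
\mathrm{WO}(\prec) \;\rightarrow\; \exists H\,\forall a \in \mathrm{field}(\prec)\,
\bigl( (H)_a \;=\; \{\, n : \theta(n, H^{\prec a}) \,\} \bigr),
\]
\[
\text{where } (H)_a = \{\, n : (n,a) \in H \,\}
\quad\text{and}\quad
H^{\prec a} = \{\, (n,b) : b \prec a \wedge (n,b) \in H \,\}.
\]
```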
Abstract:
We define an applicative theory of truth TPT which proves totality exactly for the polynomial time computable functions. TPT has natural and simple axioms, since nearly all its truth axioms are standard for truth theories over an applicative framework. The only exception is the axiom dealing with the word predicate: the truth predicate can only reflect elementhood in the words for terms that have smaller length than a given word. This makes it possible to achieve this very low proof-theoretic strength. Truth induction can be allowed without any constraints. For these reasons, the system TPT has the high expressive power one expects from truth theories. It allows embeddings of feasible systems of explicit mathematics and bounded arithmetic. The proof that the theory TPT is feasible is not easy: it is not possible to apply a standard realisation approach. For this reason we develop a new realisation approach whose realisation functions work on directed acyclic graphs. In this way, we can express and manipulate realisation information more efficiently.
Abstract:
Software dependencies play a vital role in programme comprehension, change impact analysis and other software maintenance activities. Traditionally, these activities are supported by source code analysis; however, the source code is sometimes inaccessible or difficult to analyse, as in hybrid systems composed of source code in multiple languages using various paradigms (e.g. object-oriented programming and relational databases). Moreover, not all stakeholders have adequate knowledge to perform such analyses. For example, non-technical domain experts and consultants raise most maintenance requests; however, they cannot predict the cost and impact of the requested changes without the support of the developers. We propose a novel approach to predicting software dependencies by exploiting the coupling present in domain-level information. Our approach is independent of the software implementation; hence, it can be used to approximate architectural dependencies without access to the source code or the database. As such, it can be applied to hybrid systems with heterogeneous source code or legacy systems with missing source code. In addition, this approach is based solely on information visible and understandable to domain users; therefore, it can be used efficiently by domain experts without the support of software developers. We evaluate our approach with a case study on a large-scale enterprise system, in which we demonstrate how up to 65% of the source code dependencies and 77% of the database dependencies are predicted solely on the basis of domain information.
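The following Python sketch conveys the general idea under simple assumptions: if two domain entities share enough domain-level attributes (information visible to non-technical users), their implementing modules are predicted to depend on each other. The coupling measure (Jaccard similarity on shared attributes), the entities and the threshold are illustrative inventions, not the paper's actual method or data.

```python
from itertools import combinations

# Domain-level model as a non-technical user might describe it: entities and
# the attributes they expose. No source code or database schema is consulted.
domain_model = {
    "Order":    {"order_id", "customer_id", "product_id", "price", "date"},
    "Invoice":  {"customer_id", "order_id", "price", "tax"},
    "Shipment": {"order_id", "address", "date"},
    "Customer": {"customer_id", "address", "name"},
}

def coupling(a, b):
    """Jaccard similarity of the attribute sets of two domain entities."""
    return len(domain_model[a] & domain_model[b]) / len(domain_model[a] | domain_model[b])

THRESHOLD = 0.2   # assumed cut-off for predicting a dependency
predicted = [(a, b, round(coupling(a, b), 2))
             for a, b in combinations(domain_model, 2)
             if coupling(a, b) >= THRESHOLD]
print(predicted)   # candidate architectural dependencies, derived without code access
```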
Abstract:
The domain of context-free languages has been extensively explored and there exist numerous techniques for parsing (all or a subset of) context-free languages. Unfortunately, some programming languages are not context-free. Using standard context-free parsing techniques to parse a context-sensitive programming language poses a considerable challenge. Implementors of programming language parsers have adopted various techniques, such as hand-written parsers, special lexers, or post-processing of an ambiguous parser output, to deal with that challenge. In this paper we suggest a simple extension of a top-down parser with contextual information. Contrary to the traditional approach that uses only the input stream as an input to a parsing function, we use a parsing context that provides access to a stream and possibly to other context-sensitive information. At the same time we keep the context-free formalism, so a grammar definition stays simple, without mind-blowing context-sensitive rules. We show that our approach can be used for various purposes, such as indent-sensitive parsing, high-precision island parsing or XML parsing (with arbitrary element names). We demonstrate our solution with PetitParser, a parsing-expression-grammar-based, top-down parser combinator framework written in Smalltalk.
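To illustrate the idea (in Python rather than PetitParser/Smalltalk), the minimal sketch below passes a parsing context, the input plus an indentation stack, to each parsing function, so an indent-sensitive rule can be expressed while the combinators themselves stay simple. All names in the sketch are invented for the example and do not correspond to the PetitParser API.

```python
from dataclasses import dataclass, field

@dataclass
class Context:
    """Parsing context: the input stream position plus extra state
    (here, an indentation stack) available to every parsing function."""
    text: str
    pos: int = 0
    indents: list = field(default_factory=lambda: [0])

def literal(s):
    """Combinator matching an exact string at the current position."""
    def parse(ctx):
        if ctx.text.startswith(s, ctx.pos):
            ctx.pos += len(s)
            return s
        raise SyntaxError(f"expected {s!r} at {ctx.pos}")
    return parse

def indented_block(item):
    """Parse an item that must be indented deeper than the enclosing block."""
    def parse(ctx):
        col = ctx.pos - ctx.text.rfind("\n", 0, ctx.pos) - 1
        if col <= ctx.indents[-1]:
            raise SyntaxError("expected deeper indentation")
        ctx.indents.append(col)
        result = [item(ctx)]
        ctx.indents.pop()
        return result
    return parse

ctx = Context("if:\n  body\n")
literal("if:\n  ")(ctx)                       # consume the header
print(indented_block(literal("body"))(ctx))   # -> ['body']
```

The grammar-facing combinators stay context-free in shape; only the shared context carries the extra, context-sensitive state.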
Abstract:
Code clone detection helps connect developers across projects, if we do it on a large scale. The cornerstones that allow clone detection to work on a large scale are: (1) bad hashing, (2) lightweight parsing using regular expressions, and (3) MapReduce pipelines. Bad hashing means determining whether or not two artifacts are similar by checking whether their hashes are identical. We show a bad hashing scheme that works well on source code. Lightweight parsing using regular expressions is our technique for obtaining entire parse trees from regular expressions, robustly and efficiently. We detail the algorithm and implementation of one such regular expression engine. MapReduce pipelines are a way of expressing a computation such that it can be parallelized automatically and simply. We detail the design and implementation of one such MapReduce pipeline that is efficient and debuggable. We show a clone detector that combines these cornerstones to detect code clones across all projects and across all versions of each project.
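As a rough illustration of the bad-hashing cornerstone, the Python sketch below normalizes a code fragment aggressively and compares hashes; identical digests mark two fragments as clone candidates. The particular normalization steps are assumptions made for the example and may differ from the scheme used in the paper.

```python
import hashlib
import re

def bad_hash(code: str) -> str:
    """Hash a code fragment after aggressive normalization, so that fragments
    differing only in identifiers, comments or whitespace collide."""
    code = re.sub(r"//.*|#.*", "", code)             # strip line comments
    code = re.sub(r"\b[A-Za-z_]\w*\b", "id", code)   # collapse identifiers/keywords
    code = re.sub(r"\s+", "", code)                  # drop all whitespace
    return hashlib.md5(code.encode()).hexdigest()

a = "int total = x + y;  // sum"
b = "int result=a+b;"
print(bad_hash(a) == bad_hash(b))   # True: the two fragments are clone candidates
```

Because the comparison reduces to grouping identical digests, it parallelizes naturally as the keying step of a MapReduce pipeline.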