Abstract:
In the latter half of the twentieth century the workforce dynamic changed as the number of women entering the workforce increased by record amounts. In direct opposition to this change was the inability of organizations to meet the needs of employees with childcare concerns. Organizations and employees alike are best served when policies, procedures, and benefits are implemented to achieve a positive work/life balance. Companies that institute family-supportive benefits observe decreases in turnover and increased employee retention. Employees who are offered family-friendly resources have been known to stay with companies even when offered a higher salary elsewhere, demonstrating that retention of valued employees is linked to an organization's ability to offer support for family needs.
Abstract:
Background: Despite the existence of ample literature dealing, on the one hand, with the integration of innovations within health systems and team learning, and, on the other hand, with different aspects of the detection and management of intimate partner violence (IPV) within healthcare facilities, research that explores how health innovations that go beyond biomedical issues—such as IPV management—get integrated into health systems, and that focuses on healthcare teams’ learning processes is, to the best of our knowledge, very scarce if not absent. This realist evaluation protocol aims to ascertain: why, how, and under what circumstances primary healthcare teams engage (if at all) in a learning process to integrate IPV management in their practices; and why, how, and under what circumstances team learning processes lead to the development of organizational culture and values regarding IPV management, and the delivery of IPV management services. Methods: This study will be conducted in Spain using a multiple-case study design. Data will be collected from selected cases (primary healthcare teams) through different methods: individual and group interviews, routinely collected statistical data, documentary review, and observation. Cases will be purposively selected in order to enable testing the initial middle-range theory (MRT). After in-depth exploration of a limited number of cases, additional cases will be chosen for their ability to contribute to refining the emerging MRT to explain how primary healthcare teams learn to integrate intimate partner violence management. Discussion: Evaluations of health sector responses to IPV are scarce, and even fewer focus on why, how, and when healthcare services integrate IPV management. There is a consensus that healthcare professionals and healthcare teams play a key role in this integration, and that training is important in order to realize changes.
However, little is known about team learning of IPV management, both in terms of how to trigger such learning and how team learning is connected with changes in organizational culture and values, and in service delivery. This realist evaluation protocol aims to contribute to this knowledge by conducting this project in a country, Spain, where great endeavours have been made towards the integration of IPV management within the health system.
Abstract:
Self-organising neural models have the ability to provide a good representation of the input space. In particular, the Growing Neural Gas (GNG) is a suitable model because of its flexibility, rapid adaptation and excellent quality of representation. However, this type of learning is time-consuming, especially for high-dimensional input data. Since real applications often work under time constraints, it is necessary to adapt the learning process in order to complete it in a predefined time. This paper proposes a Graphics Processing Unit (GPU) parallel implementation of the GNG with Compute Unified Device Architecture (CUDA). In contrast to existing algorithms, the proposed GPU implementation accelerates the learning process while maintaining a good quality of representation. Comparative experiments using iterative, parallel and hybrid implementations are carried out to demonstrate the effectiveness of the CUDA implementation. The results show that GNG learning with the proposed implementation achieves a speed-up of 6× compared with the single-threaded CPU implementation. The GPU implementation has also been applied to a real application with time constraints, acceleration of 3D scene reconstruction for egomotion, in order to validate the proposal.
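To illustrate the kind of sequential work the abstract describes as time-consuming, here is a minimal sketch of a single GNG adaptation step in Python/NumPy (not the paper's CUDA implementation); the number of units, learning rates, and random data are illustrative assumptions:

```python
import numpy as np

def gng_step(units, edges, x, eps_w=0.2, eps_n=0.006):
    """One simplified Growing Neural Gas adaptation step: find the two
    units nearest to input x, connect them, and move the winner (and its
    topological neighbours) toward x."""
    d = np.linalg.norm(units - x, axis=1)
    s1, s2 = np.argsort(d)[:2]                 # winner and runner-up
    edges.add(frozenset((int(s1), int(s2))))   # create/refresh their edge
    units[s1] += eps_w * (x - units[s1])       # move winner toward x
    for e in edges:                            # move winner's neighbours
        a, b = tuple(e)
        if a == int(s1):
            units[b] += eps_n * (x - units[b])
        elif b == int(s1):
            units[a] += eps_n * (x - units[a])
    return units, edges

rng = np.random.default_rng(0)
units = rng.random((4, 2))   # 4 units in a 2D input space
edges = set()
for _ in range(200):
    units, edges = gng_step(units, edges, rng.random(2))
```

The nearest-neighbour search inside each step is the part that dominates runtime in high dimensions, which is what the paper's CUDA version parallelises.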
Abstract:
Modern compilers present a great and ever increasing number of options which can modify the features and behavior of a compiled program. Many of these options are often wasted due to the comprehensive knowledge they require about both the underlying architecture and the internal processes of the compiler. In this context, it is usual not to have a single design goal but a more complex set of objectives. In addition, the dependencies between different goals are difficult to infer a priori. This paper proposes a strategy for tuning the compilation of any given application. This is accomplished by automatically varying the compilation options by means of multi-objective optimization and evolutionary computation driven by the NSGA-II algorithm. This allows finding compilation options that simultaneously optimize different objectives. The advantages of our proposal are illustrated by means of a case study based on the well-known Apache web server. Our strategy has demonstrated an ability to find improvements of up to 7.5% and up to 27% in context switches and L2 cache misses, respectively, and also discovers the most important bottlenecks involved in the application performance.
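The core of the multi-objective selection that NSGA-II builds on is non-dominated (Pareto) filtering over candidate flag sets. The sketch below is a toy illustration, not the paper's method: the flag list is arbitrary, and `evaluate` is a hypothetical stand-in for actually compiling and benchmarking the program on the two objectives (context switches and L2 cache misses, both minimised):

```python
import random
from itertools import combinations

FLAGS = ["-O2", "-funroll-loops", "-fomit-frame-pointer", "-floop-interchange"]

def evaluate(cfg):
    # Hypothetical stand-in for compiling + benchmarking: returns a
    # deterministic pseudo-score (context_switches, l2_misses) per config.
    rnd = random.Random(",".join(cfg))
    return (rnd.uniform(0, 1), rnd.uniform(0, 1))

def dominates(a, b):
    """a dominates b if it is no worse on every objective, better on one."""
    return all(x <= y for x, y in zip(a, b)) and any(x < y for x, y in zip(a, b))

def pareto_front(configs):
    scored = [(cfg, evaluate(cfg)) for cfg in configs]
    return [cfg for cfg, s in scored
            if not any(dominates(t, s) for _, t in scored if t != s)]

# Enumerate every on/off combination of the candidate flags (2^4 = 16)
configs = [tuple(c) for r in range(len(FLAGS) + 1)
           for c in combinations(FLAGS, r)]
front = pareto_front(configs)
```

NSGA-II adds crowding-distance sorting and genetic operators on top of this filter so that the search scales beyond exhaustive enumeration.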
Abstract:
Improvement of the features of an enzyme is in many instances a prerequisite for the industrial implementation of these exceedingly interesting biocatalysts. To reach this goal, the researcher may utilize different tools. For example, amination of the enzyme surface alters the isoelectric point of the protein along with its chemical reactivity (primary amino groups are the most widely used to obtain the reaction of the enzyme with surfaces, chemical modifiers, etc.) and even its “in vivo” behavior. This review will show some examples of chemical (mainly modifying the carboxylic groups using the carbodiimide route), physical (using polycationic polymers like polyethyleneimine) and genetic amination of the enzyme surface. Special emphasis will be put on cases where the amination is performed to improve subsequent protein modifications. Thus, amination has been used to increase the intensity of the enzyme/support multipoint covalent attachment, to improve the interaction with cation exchanger supports or polymers, or to promote the formation of crosslinks (both intra-molecular and in the production of crosslinked enzyme aggregates). In other cases, amination has been used to directly modulate the enzyme properties (in both immobilized and free form). Amination of the enzyme surface may also pursue other goals not related to biocatalysis. For example, it has been used to improve the raising of antibodies against different compounds (both increasing the number of haptens per enzyme and the immunogenicity of the composite) or the ability to penetrate cell membranes. Thus, amination may be a very powerful tool to improve the use of enzymes and proteins in many different areas, and a great expansion of its usage may be expected in the near future.
Abstract:
In many classification problems, it is necessary to consider the specific location of an n-dimensional space from which features have been calculated. For example, considering the location of features extracted from specific areas of a two-dimensional space, such as an image, could improve the understanding of a scene for a video surveillance system. In the same way, the same features extracted from different locations could mean different actions for a 3D HCI system. In this paper, we present a self-organizing feature map able to preserve the topology of the locations of an n-dimensional space from which the feature vectors have been extracted. The main contribution is to implicitly preserve the topology of the original space, because considering the locations of the extracted features and their topology could ease the solution to certain problems. Specifically, the paper proposes the n-dimensional constrained self-organizing map preserving the input topology (nD-SOM-PINT). Features in adjacent areas of the n-dimensional space, used to extract the feature vectors, are explicitly in adjacent areas of the nD-SOM-PINT, constraining the neural network structure and learning. As a case study, the neural network has been instantiated to represent and classify features, such as trajectories extracted from a sequence of images, at a high level of semantic understanding. Experiments have been thoroughly carried out using the CAVIAR datasets (Corridor, Frontal and Inria), taking into account the global behaviour of an individual, in order to validate the ability to preserve the topology of the two-dimensional space and obtain high-performance trajectory classification, in contrast to not considering the location of features. Moreover, a brief example has been included to validate the nD-SOM-PINT proposal in a domain other than individual trajectories.
Results confirm the high accuracy of the nD-SOM-PINT, which outperforms previous methods aimed at classifying the same datasets.
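For background, the topology-preservation idea the abstract builds on can be seen in a minimal classical 2D self-organizing map (this is plain Kohonen SOM, not the paper's constrained nD-SOM-PINT); the grid size, learning rate, and neighbourhood width below are illustrative assumptions:

```python
import numpy as np

def som_train(data, grid=(4, 4), epochs=20, lr=0.5, sigma=1.0, seed=0):
    """Minimal 2D self-organizing map: each weight vector lives on a grid,
    and the Gaussian neighbourhood over grid coordinates is what makes
    nearby inputs map to nearby grid units (topology preservation)."""
    rng = np.random.default_rng(seed)
    h, w = grid
    weights = rng.random((h, w, data.shape[1]))
    coords = np.stack(np.meshgrid(np.arange(h), np.arange(w),
                                  indexing="ij"), axis=-1)
    for _ in range(epochs):
        for x in data:
            d = np.linalg.norm(weights - x, axis=2)
            bmu = np.array(np.unravel_index(np.argmin(d), (h, w)))
            # Gaussian falloff over *grid* distance to the best-matching unit
            g = np.exp(-np.sum((coords - bmu) ** 2, axis=2) / (2 * sigma**2))
            weights += lr * g[..., None] * (x - weights)
    return weights

rng = np.random.default_rng(1)
data = rng.random((50, 2))       # toy 2D feature vectors
weights = som_train(data)
```

The paper's contribution constrains this structure so that units are tied to the locations in the input space from which features were extracted, rather than letting the grid organise freely.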
Abstract:
In this brief petition of John Wyeth to the Harvard Corporation, he requests the ability to borrow books from the "Publick Library" of the College.
Abstract:
There has been a tremendous increase in our knowledge of human motor performance over the last few decades. Our theoretical understanding of how an individual learns to move is sophisticated and complex. It is difficult, however, to relate much of this information in practical terms to physical educators, coaches, and therapists concerned with the learning of motor skills (Shumway-Cook & Woolcott, 1995). Much of our knowledge stems from lab testing which often appears to bear little relation to real-life situations. This lack of ecological validity has slowed the flow of information from the theorists and researchers to the practitioners. This paper is concerned with taking some small aspects of motor learning theory, unifying them, and presenting them in a usable fashion. The intention is not to present a recipe for teaching motor skills, but to present a framework from which solutions can be found. If motor performance research has taught us anything, it is that every individual and situation presents unique challenges. By increasing our ability to conceptualize the learning situation we should be able to develop more flexible and adaptive responses to the challenge of teaching motor skills. The model presented here allows a teacher, coach, or therapist to use readily available observations and known characteristics about a motor task and to conceptualize them in a manner which allows them to make appropriate teaching/learning decisions.
Abstract:
We investigated whether children’s inhibitory control is associated with their ability to produce irregular verb forms as well as learn from corrective feedback following their use of an over-regularized form. Forty-eight 3.5- to 4.5-year-old children were tested on the irregular past tense and provided with adult corrective input via models of correct use or recasts of errors following ungrammatical responses. Inhibitory control was assessed with a three-item battery of tasks that required suppressing a prepotent response in favor of a non-canonical one. Results showed that inhibitory control was predictive of children’s initial production of irregular forms and not associated with their post-feedback production of irregulars. These findings show that children’s executive functioning skills may be a rate-limiting factor on their ability to produce correct forms, but might not interact with their ability to learn from input in this domain. Findings are discussed in terms of current theories of past-tense acquisition and learning from input more broadly.
Abstract:
Final dissertation of the Integrated Master's degree in Medicine, Faculdade de Medicina, Universidade de Lisboa, 2014
Abstract:
The most straightforward European single energy market design would entail a European system operator regulated by a single European regulator. This would ensure the predictable development of rules for the entire EU, significantly reducing regulatory uncertainty for electricity sector investments. But such a first-best market design is unlikely to be politically realistic in the European context for three reasons. First, the necessary changes compared to the current situation are substantial and would produce significant redistributive effects. Second, a European solution would deprive member states of the ability to manage their energy systems nationally. And third, a single European solution might fall short of being well-tailored to consumers’ preferences, which differ substantially across the EU. To nevertheless reap significant benefits from an integrated European electricity market, we propose the following blueprint: First, we suggest adding a European system-management layer to complement national operation centres and help them to better exchange information about the status of the system, expected changes and planned modifications. The ultimate aim should be to transfer the day-to-day responsibility for the safe and economic operation of the system to the European control centre. To further increase efficiency, electricity prices should be allowed to differ between all network points between and within countries. This would enable throughput of electricity through national and international lines to be safely increased without any major investments in infrastructure. Second, to ensure the consistency of national network plans and that they contribute to providing the infrastructure for a functioning single market, the role of the European ten-year network development plan (TYNDP) needs to be upgraded by obliging national regulators to approve only projects planned at the European level, unless they can prove that deviations are beneficial.
This boosted role of the TYNDP would need to be underpinned by resolving the issues of conflicting interests and information asymmetry. Therefore, the network planning process should be opened to all affected stakeholders (generators, network owners and operators, consumers, residents and others) and enable the European Agency for the Cooperation of Energy Regulators (ACER) to act as a welfare-maximising referee. An ultimate political decision by the European Parliament on the entire plan will open a negotiation process around selecting alternatives and agreeing compensation. This ensures that all stakeholders have an interest in guaranteeing a certain degree of balance of interest in the earlier stages. In fact, transparent planning, early stakeholder involvement and democratic legitimisation are well suited for minimising as much as possible local opposition to new lines. Third, sharing the cost of network investments in Europe is a critical issue. One reason is that so far even the most sophisticated models have been unable to identify the individual long-term net benefit in an uncertain environment. A workable compromise to finance new network investments would consist of three components: (i) all easily attributable cost should be levied on the responsible party; (ii) all network users that sit at nodes that are expected to receive more imports through a line extension should be obliged to pay a share of the line extension cost through their network charges; (iii) the rest of the cost is socialised to all consumers. Such a cost-distribution scheme will involve some intra-European redistribution from the well-developed countries (infrastructure-wise) to those that are catching up. However, such a scheme would perform this redistribution in a much more efficient way than the Connecting Europe Facility’s ad-hoc disbursements to politically chosen projects, because it would provide the infrastructure that is really needed.
Abstract:
The future of Europe 2020 lies in its ability to become the protagonist of a new season in EU policy, in which countries can apply for more flexibility only if they can prove both structural reform and good governance, argues the author. By establishing a ‘new deal’ among member states, an improved Europe 2020 strategy can help Europe to complete its transition from austerity to prosperity. This Policy Brief makes the case for approaching the mid-term review of Europe 2020 on three different levels: i) the revision and update of the content of the Europe 2020 strategy, including its objectives, targets and major flagship initiatives; ii) the reform of the governance of the strategy; and iii) the repositioning of the strategy at the core of EU policy. The content of the strategy should be revised to include initiatives on infrastructure, the internal market and administrative capacity at all levels of government. The author sets out a number of policy recommendations for the European Commission and the member states to help realise these objectives.
Abstract:
While acknowledging that the sustainability of sovereign debt is a serious issue that must be confronted, this EuropEos Commentary finds that financial markets have blown the problem completely out of proportion, leading to a full-scale confidence crisis. The authors present evidence suggesting that politicians’ public disagreements and careless statements at critical junctures may have added fuel to an incipient fire. By creating the impression that domestic political interests would take precedence over orderly management of the Greek debt crisis, they raised broader doubts about their ability to address fundamental economic divergences within the area, which are the real source of debt sustainability problems in the medium term.
Abstract:
In this paper we try to present the main trends of evolution of the ICT sector. Its dynamics, supported by constant technical progress in ICs, compounded with “non-convexities” such as network effects and high sunk costs, may lead either to a Schumpeter Mark I or a Schumpeter Mark II competition regime. This means that in some segments the market will be more competitive (Mark I), while in others it will be more monopolistic (Mark II). A key trend is also the so-called “convergence”: digitization makes it cost-effective to integrate different communications, information processing and entertainment systems and devices. Hence, Schumpeter Mark II grows at the core, where software production dominates, while Schumpeter Mark I is established at the periphery. In this context, the European ICT industry is potentially squeezed between two forces: the cost advantages of Asian countries on one hand, and the inventiveness and dynamism of the US industry on the other. The way out of this very difficult situation is to create in Europe the conditions for restoring knowledge accumulation in a key sub-sector of ICT, that is, software production. To do this, Europe can rely on its tradition of cooperation and knowledge sharing and on a set of institutions that have shown their ability to stimulate inter-regional cooperation. By concentrating on an ambitious project of open source software production in embedded systems and domestic networks, Europe could reach several objectives: to make freely accessible an essential facility, to stimulate competition, to help reach the Lisbon objectives and to restore European competitiveness in ICT.
Abstract:
International trade in textiles and apparel has, as of January 1, 2005, been set free from the very intricate Multi-Fiber textile and apparel quota Arrangement (MFA). This event has raised many uncertainties about the new international trade climate and has placed enormous pressure on China as the expected clear-cut beneficiary of this liberalization. Other countries considered to be major contenders include Vietnam, which also has a large population employed in the textile and apparel (T&A) sector. Since the old quota system had provided a certain degree of market certainty to competing T&A producers, will the new free trade environment lead to a shake-out where mass producers with large economies of scale dominate the new reality? The removal of T&A quotas will create opportunities for Vietnam and China along with other developing countries, but it will also expose them to additional competition from each other. The outcome of this competition will depend on the demand in the US, the ability of the exporting countries to differentiate their exports, and on their ability to transfer additional resources to expand domestic output in the direction of the new 'free market signals' and away from rent-seeking objectives. Obviously, exporting countries that adjust to this new environment quickly will improve their competitiveness, and will be the new beneficiaries of a quota-free international trade in textiles and apparel. This paper attempts to shed some light on the differences and similarities in the responses of the Chinese and Vietnamese T&A sectors to this new environment. It first focuses on the demand side, attempting to determine whether or not Chinese and Vietnamese T&A items, formerly under quota control, are substitutes or complements.
On the supply side, the paper focuses on institutional differences between each country's T&A sectors, the different domestic government policies that have contributed to their growth and the unique cultural differences which will determine the future progress in each country's T&A sectors.