884 results for Home-based work
Abstract:
Demosaicking is a particular case of interpolation problems where, from a scalar image in which each pixel has either the red, the green, or the blue component, we want to interpolate the full-color image. State-of-the-art demosaicking algorithms perform interpolation along edges, but these edges are estimated locally. We propose a level-set-based geometric method to estimate image edges, inspired by the image inpainting literature. This method has a time complexity of O(S), where S is the number of pixels in the image, and compares favorably with state-of-the-art algorithms both visually and in most relevant image quality measures.
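For readers unfamiliar with the problem set-up, the sketch below performs naive bilinear interpolation of an RGGB Bayer mosaic; it is a standard baseline, not the level-set method proposed here, and the kernels, pattern layout, and random test image are illustrative assumptions.

```python
# Baseline sketch of the demosaicking set-up: bilinear interpolation of an RGGB Bayer
# mosaic (NOT the level-set method of the abstract, just the naive textbook baseline).
import numpy as np
from scipy.signal import convolve2d

def bilinear_demosaick(mosaic):
    """mosaic: 2-D array sampled on an RGGB Bayer pattern; returns an HxWx3 image."""
    h, w = mosaic.shape
    r_mask = np.zeros((h, w)); r_mask[0::2, 0::2] = 1   # red samples
    b_mask = np.zeros((h, w)); b_mask[1::2, 1::2] = 1   # blue samples
    g_mask = 1 - r_mask - b_mask                        # green checkerboard
    k_rb = np.array([[1, 2, 1], [2, 4, 2], [1, 2, 1]]) / 4.0   # red/blue interpolation kernel
    k_g = np.array([[0, 1, 0], [1, 4, 1], [0, 1, 0]]) / 4.0    # green interpolation kernel
    out = np.empty((h, w, 3))
    for c, (mask, kern) in enumerate([(r_mask, k_rb), (g_mask, k_g), (b_mask, k_rb)]):
        out[..., c] = convolve2d(mosaic * mask, kern, mode="same", boundary="symm")
    return out

bayer = np.random.rand(8, 8)        # stand-in mosaic for illustration
rgb = bilinear_demosaick(bayer)
print(rgb.shape)                    # (8, 8, 3)
```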
Abstract:
The emergence of Web 2.0 technologies in recent years has changed the way people interact with knowledge. Services for cooperation and collaboration have placed the user at the centre of a new knowledge-building space. The development of new second-generation learning environments can benefit from the potential of these Web 2.0 services when applied to an educational context. We propose a methodology for designing learning environments that relates Web 2.0 services with the functional requirements of these environments. In particular, we concentrate on the design of the KRSM system to discuss the components of this methodology and its application.
Abstract:
One of the most relevant difficulties faced by first-year undergraduate students is settling into the educational environment of universities. This paper presents a case study that proposes a computer-assisted collaborative experience designed to help students in their transition from high school to university. This is done by facilitating their first contact with the campus and its services, the university community, and its methodologies and activities. The experience combines individual and collaborative activities, conducted in and out of the classroom, structured following the Jigsaw Collaborative Learning Flow Pattern. A specific environment including portable technologies with network and computer applications has been developed to support and facilitate the orchestration of a flow of learning activities into a single integrated learning setting. The result is a Computer-Supported Collaborative Blended Learning scenario, which has been evaluated with first-year university students of the Software and Audiovisual Engineering degrees within the subject Introduction to Information and Communications Technologies. The findings reveal that the scenario significantly improves students' interest in their studies and their understanding of the campus and the services provided. The environment is also an innovative approach that successfully supports the heterogeneous activities conducted by both teachers and students during the scenario. This paper introduces the goals and context of the case study, describes how the technology was employed to conduct the learning scenario, and presents the evaluation methods and the main results of the experience.
Abstract:
It is well known that multiple-input multiple-output (MIMO) techniques can bring numerous benefits, such as higher spectral efficiency, to point-to-point wireless links. More recently, there has been interest in extending MIMO concepts to multiuser wireless systems. Our focus in this paper is on network MIMO, a family of techniques whereby each end user in a wireless access network is served through several access points within its range of influence. By tightly coordinating the transmission and reception of signals at multiple access points, network MIMO can transcend the limits on spectral efficiency imposed by cochannel interference. Taking prior information-theoretic analyses of network MIMO to the next level, we quantify the spectral efficiency gains obtainable under realistic propagation and operational conditions in a typical indoor deployment. Our study relies on detailed simulations and, for specificity, is conducted largely within the physical-layer framework of the IEEE 802.16e Mobile WiMAX system. Furthermore, to facilitate the coordination between access points, we assume that a high-capacity local area network, such as Gigabit Ethernet, connects all the access points. Our results confirm that network MIMO stands to provide a multiple-fold increase in spectral efficiency under these conditions.
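As background for the spectral-efficiency discussion above, the sketch below evaluates the standard MIMO capacity expression C = log2 det(I + (SNR/Nt) H H^H) for a randomly drawn channel; the 4x4 Rayleigh channel, equal power allocation, and 10 dB SNR are illustrative assumptions rather than parameters of the WiMAX simulations described in the abstract.

```python
# Illustrative MIMO link capacity under equal power allocation (not the paper's simulator).
import numpy as np

def mimo_capacity(H, snr_linear):
    """Shannon capacity (bit/s/Hz) of a MIMO link: C = log2 det(I + (SNR / Nt) * H H^H)."""
    nr, nt = H.shape
    _, logdet = np.linalg.slogdet(np.eye(nr) + (snr_linear / nt) * (H @ H.conj().T))
    return logdet / np.log(2)

rng = np.random.default_rng(0)
nt, nr, snr_db = 4, 4, 10.0                                  # assumed antenna counts and SNR
H = (rng.standard_normal((nr, nt)) + 1j * rng.standard_normal((nr, nt))) / np.sqrt(2)
print(f"4x4 Rayleigh channel at 10 dB: {mimo_capacity(H, 10**(snr_db/10)):.2f} bit/s/Hz")
```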
Abstract:
This paper introduces Collage, a high-level IMS-LD compliant authoring tool that is specialized for CSCL (Computer-Supported Collaborative Learning). CSCL is nowadays a key trend in e-learning, since it highlights the importance of social interactions as an essential element of learning. CSCL is an interdisciplinary domain, which demands participatory design techniques that allow teachers to get directly involved in design activities. Developing CSCL designs using LD is a difficult task for teachers, since LD is a complex technical specification and modelling collaborative characteristics can be tricky. Collage helps teachers in the process of creating their own potentially effective collaborative Learning Designs by reusing and customizing patterns, according to the requirements of a particular learning situation. These patterns, called Collaborative Learning Flow Patterns (CLFPs), represent best practices that are repeatedly used by practitioners when structuring the flow of (collaborative) learning activities. An example of an LD that can be created using Collage is illustrated in the paper. Preliminary evaluation results show that teachers with experience in collaborative learning but without LD knowledge can successfully design real collaborative learning experiences using Collage.
Abstract:
Computer-based training and distance education are facing dramatic changes with the advent of standardization efforts, some of them concentrating on maximal reuse. This is of paramount importance for a sustainable, cost-affordable production of educational materials. Reuse in itself should not be a goal, though, since many methodological aspects might be lost. In this paper we propose two content production approaches for the InterMediActor platform under a competence-based methodology: either a bottom-up approach, where content is designed from scratch, or a top-down methodology, where existing material is gradually adapted to fulfil the requisites for use with maximal flexibility within InterMediActor.
Abstract:
Student guidance is an always-desired characteristic in any educational system, but it is especially difficult to deploy in an automated way within a computer-supported educational tool. In this paper we explore possible avenues relying on machine learning techniques, to be included in the near future, in the form of a tutoring navigational tool, in a tele-education platform, InterMediActor, currently under development. Since no data from that platform are available yet, the preliminary experiments presented in this paper are built by interpreting every subject in the Telecommunications Degree at Universidad Carlos III de Madrid as an aggregated macro-competence (following the methodological considerations in InterMediActor), such that the marks achieved by students can be used as data for the models, to be replaced in the near future by real data measured directly inside InterMediActor. We evaluate the predictability of students' qualifications, and we deploy a preventive early detection system (failure alert) to identify those students more prone to fail a certain subject, such that corrective means can be deployed with sufficient anticipation.
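Purely as an illustration of the failure-alert idea described above, the sketch below trains a logistic-regression classifier on synthetic marks from earlier subjects and flags students whose estimated probability of failure exceeds a threshold; the feature layout, model choice, synthetic data, and threshold are assumptions, not details taken from the paper.

```python
# Minimal "failure alert" sketch: flag students at risk of failing a subject from
# marks in earlier subjects. All data and modelling choices are illustrative.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(1)
prior_marks = rng.uniform(0, 10, size=(200, 5))              # marks in 5 earlier subjects
failed = (prior_marks.mean(axis=1) + rng.normal(0, 1, 200) < 5).astype(int)

model = LogisticRegression().fit(prior_marks, failed)
risk = model.predict_proba(prior_marks)[:, 1]                # estimated failure probability
alert = risk > 0.5                                           # students to warn early
print(f"{alert.sum()} of {len(alert)} students flagged for early intervention")
```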
Abstract:
A systolic array to implement lattice-reduction-aided linear detection is proposed for a MIMO receiver. The lattice reduction algorithm and the ensuing linear detection are operated in the same array, which can be hardware-efficient. The all-swap lattice reduction algorithm (ASLR) is considered for the systolic design. ASLR is a variant of the LLL algorithm which processes all lattice basis vectors within one iteration. Lattice-reduction-aided linear detectors based on the ASLR and LLL algorithms have very similar bit-error-rate performance, while ASLR is more time-efficient in the systolic array, especially for systems with a large number of antennas.
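To make the detection scheme concrete, here is a rough, self-contained sketch of lattice-reduction-aided zero-forcing detection using the textbook LLL algorithm rather than the paper's all-swap variant or its systolic-array mapping; the real-valued channel, integer-valued transmit vector, and low noise level are simplifying assumptions (practical QAM symbols additionally require a shift and scaling).

```python
import numpy as np

def gram_schmidt(B):
    """Gram-Schmidt orthogonalization of the columns of B: returns Q* and the mu coefficients."""
    n = B.shape[1]
    Q = np.zeros_like(B, dtype=float)
    mu = np.zeros((n, n))
    for i in range(n):
        Q[:, i] = B[:, i]
        for j in range(i):
            mu[i, j] = B[:, i] @ Q[:, j] / (Q[:, j] @ Q[:, j])
            Q[:, i] -= mu[i, j] * Q[:, j]
    return Q, mu

def lll_reduce(B, delta=0.75):
    """Textbook LLL reduction (not the paper's all-swap variant). Returns the reduced
    basis B_red and an integer unimodular matrix T such that B_red = B @ T."""
    B = B.astype(float).copy()
    n = B.shape[1]
    T = np.eye(n)
    k = 1
    while k < n:
        for j in range(k - 1, -1, -1):                     # size-reduce column k
            _, mu = gram_schmidt(B)
            q = int(round(mu[k, j]))
            if q:
                B[:, k] -= q * B[:, j]
                T[:, k] -= q * T[:, j]
        Q, mu = gram_schmidt(B)
        if Q[:, k] @ Q[:, k] >= (delta - mu[k, k - 1] ** 2) * (Q[:, k - 1] @ Q[:, k - 1]):
            k += 1                                          # Lovasz condition satisfied
        else:
            B[:, [k - 1, k]] = B[:, [k, k - 1]]             # swap the two columns and step back
            T[:, [k - 1, k]] = T[:, [k, k - 1]]
            k = max(k - 1, 1)
    return B, T

# Toy lattice-reduction-aided zero-forcing detection on a real-valued 4x4 channel.
rng = np.random.default_rng(0)
H = rng.standard_normal((4, 4))
x = rng.integers(-2, 3, size=4)                             # integer "symbols" (simplifying assumption)
y = H @ x + 0.01 * rng.standard_normal(4)                   # received vector with low noise

H_red, T = lll_reduce(H)
z_hat = np.round(np.linalg.pinv(H_red) @ y)                 # linear detection in the reduced basis
x_hat = np.round(T @ z_hat).astype(int)                     # map back to the original basis
print("transmitted:", x, " detected:", x_hat)
```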
Abstract:
The paper presents a competence-based instructional design system and a way to personalize navigation through the course content. The navigation aid tool builds on the competence graph and the student model, which includes elements of uncertainty in the assessment of students. An individualized navigation graph is constructed for each student, suggesting the competences the student is more prepared to study. We use fuzzy set theory for dealing with uncertainty. The marks of the assessment tests are transformed into linguistic terms and used for assigning values to linguistic variables. For each competence, the level of difficulty and the level of knowledge of its prerequisites are calculated based on the assessment marks. Using these linguistic variables and approximate reasoning (fuzzy IF-THEN rules), a crisp category is assigned to each competence regarding its level of recommendation.
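To make the approximate-reasoning step concrete, the sketch below applies a few fuzzy IF-THEN rules to map a prerequisite-assessment mark and a difficulty score to a crisp recommendation category; the triangular membership functions, rule set, and category names are invented for illustration and are not the paper's actual rule base.

```python
# Minimal fuzzy IF-THEN sketch for competence recommendation (illustrative assumptions only).
def tri(x, a, b, c):
    """Triangular membership function peaking at b on support [a, c]."""
    if x <= a or x >= c:
        return 0.0
    return (x - a) / (b - a) if x <= b else (c - x) / (c - b)

def recommend(prereq_mark, difficulty):
    """Map a prerequisite-assessment mark (0-10) and a difficulty score (0-10)
    to a crisp recommendation category by taking the rule with maximum activation."""
    prereq = {"low": tri(prereq_mark, -1, 0, 5), "medium": tri(prereq_mark, 2, 5, 8),
              "high": tri(prereq_mark, 5, 10, 11)}
    diff = {"low": tri(difficulty, -1, 0, 5), "medium": tri(difficulty, 2, 5, 8),
            "high": tri(difficulty, 5, 10, 11)}
    rules = {  # IF prerequisites are X AND difficulty is Y THEN recommendation is Z
        "recommended": min(prereq["high"], diff["low"]),
        "neutral": max(min(prereq["medium"], diff["medium"]), min(prereq["high"], diff["medium"])),
        "not yet": max(min(prereq["low"], diff["high"]), min(prereq["low"], diff["medium"])),
    }
    return max(rules, key=rules.get)

print(recommend(prereq_mark=8.5, difficulty=3.0))   # -> "recommended"
```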
Abstract:
There is increasing interest in the early years as a focus for reducing health inequalities, as well as one that is important for the children themselves. This paper describes the introduction in England of Sure Start Local Programmes, which included home visiting within a community development approach, and an intensive home visiting programme, the Nurse-Family Partnership, for disadvantaged teenage mothers. It reflects on changes and challenges in service provision to mothers and their pre-school children in England, explaining that a long tradition of home visiting was, paradoxically, reduced as attention focused on the newer initiatives. This is now being addressed, with attention to a range of evidence-based programmes and a specific focus on health visitor provision.
Abstract:
"Sitting between your past and your future doesn't mean you are in the present." (Dakota Skye) Complex systems science is an interdisciplinary field grouping under the same umbrella dynamical phenomena from the social, natural and mathematical sciences. The emergence of a higher-order organization or behavior, transcending that expected of the linear addition of the parts, is a key factor shared by all these systems. Most complex systems can be modeled as networks that represent the interactions amongst the system's components. In addition to the actual nature of the parts' interactions, the intrinsic topological structure of the underlying network is believed to play a crucial role in the remarkable emergent behaviors exhibited by the systems. Moreover, the topology is also a key factor in explaining the extraordinary flexibility and resilience to perturbations when applied to transmission and diffusion phenomena. In this work, we study the effect of different network structures on the performance and on the fault tolerance of systems in two different contexts. In the first part, we study cellular automata, which are a simple paradigm for distributed computation. Cellular automata are made of basic Boolean computational units, the cells, relying on simple rules and information from the surrounding cells to perform a global task. The limited visibility of the cells can be modeled as a network, where interactions amongst cells are governed by an underlying structure, usually a regular one. In order to increase the performance of cellular automata, we chose to change their topology. We applied computational principles inspired by Darwinian evolution, called evolutionary algorithms, to alter the system's topological structure starting from either a regular or a random one. The outcome is remarkable, as the resulting topologies share properties of both regular and random networks, and display similarities to the Watts-Strogatz small-world networks found in social systems. Moreover, the performance and tolerance to probabilistic faults of our small-world-like cellular automata surpass those of regular ones. In the second part, we use the context of biological genetic regulatory networks and, in particular, Kauffman's random Boolean networks model. In some ways, this model is close to cellular automata, although it is not expected to perform any task. Instead, it simulates the time evolution of genetic regulation within living organisms under strict conditions. The original model, though very attractive in its simplicity, suffered from important shortcomings unveiled by recent advances in genetics and biology. We propose to use these new discoveries to improve the original model. Firstly, we have used artificial topologies believed to be closer to those of gene regulatory networks. We have also studied actual biological organisms, and used parts of their genetic regulatory networks in our models. Secondly, we have addressed the improbable full synchronicity of the events taking place on Boolean networks and proposed a more biologically plausible cascading update scheme. Finally, we tackled the actual Boolean functions of the model, i.e. the specifics of how genes activate according to the activity of upstream genes, and presented a new update function that takes into account the actual promoting and repressing effects of one gene on another. Our improved models demonstrate the expected, biologically sound behavior of the previous GRN model, yet with superior resistance to perturbations. We believe they are one step closer to the biological reality.
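As a toy illustration of the baseline model extended in the second part, the sketch below simulates a Kauffman-style random Boolean network with fully synchronous updates; the network size, connectivity K, and random truth tables are arbitrary choices, and none of the thesis's modifications (realistic topologies, cascading updates, promoter/repressor-aware update functions) are included.

```python
# Illustrative Kauffman-style random Boolean network with synchronous updates.
import numpy as np

rng = np.random.default_rng(42)
n_genes, k = 12, 2                                    # N nodes, K regulators per node (assumed)
inputs = np.array([rng.choice(n_genes, k, replace=False) for _ in range(n_genes)])
tables = rng.integers(0, 2, size=(n_genes, 2 ** k))   # one random Boolean function per gene
state = rng.integers(0, 2, size=n_genes)              # random initial expression pattern

def step(state):
    # Each gene reads its K regulators and looks up its next value in its truth table.
    idx = (state[inputs] * (2 ** np.arange(k))).sum(axis=1)
    return tables[np.arange(n_genes), idx]

for t in range(20):                                   # iterate; trajectories settle on attractors
    state = step(state)
print("state after 20 synchronous updates:", state)
```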
Abstract:
Since its creation, the Internet has permeated our daily life. The web is omnipresent for communication, research and organization. This reliance has resulted in the rapid development of the Internet. Nowadays, the Internet is the biggest container of resources. Information databases such as Wikipedia, Dmoz and the open data available on the net represent a great informational potential for mankind. Easy and free web access is one of the major features characterizing Internet culture. Ten years ago, the web was completely dominated by English. Today, the web community is no longer only English-speaking but is becoming a genuinely multilingual community. The availability of content is intertwined with the availability of logical organizations (ontologies), for which multilinguality plays a fundamental role. In this work we introduce a very high-level logical organization fully based on semiotic assumptions. We thus present the theoretical foundations as well as the ontology itself, named Linguistic Meta-Model. The most important feature of the Linguistic Meta-Model is its ability to support the representation of different knowledge sources developed according to different underlying semiotic theories. This is possible because most knowledge representation schemata, either formal or informal, can be put into the context of the so-called semiotic triangle. In order to show the main characteristics of the Linguistic Meta-Model from a practical point of view, we developed VIKI (Virtual Intelligence for Knowledge Induction). VIKI is a work-in-progress system aiming at exploiting the Linguistic Meta-Model structure for knowledge expansion. It is a modular system in which each module accomplishes a natural language processing task, from terminology extraction to knowledge retrieval. VIKI is a supporting system for the Linguistic Meta-Model, and its main task is to give some empirical evidence regarding the use of the Linguistic Meta-Model, without claiming to be thorough.
Abstract:
The purpose of the thesis was to explore the expectations of elderly people regarding the nurse-client relationship and interaction in home care. The aim is to improve the quality of care to better meet the needs of the clients. A qualitative approach was adopted. Semi-structured theme interviews were used for data collection. The interviews were conducted during spring 2006. Six elderly clients of a private home care company in Southern Finland acted as informants. Content analysis was used as the method of data analysis. The findings suggest that clients expect nurses to provide professional care with loving-kindness. Trust and mutual, active interaction were expected from the nurse-client relationship. Clients considered it important that the nurse recognizes each client's individual needs. The nurse was expected to perform duties efficiently, but in a calm and unrushed manner. A mechanical performance of tasks was considered negative. Humanity was viewed as a crucial element in the nurse-client relationship. Clients expressed their need to be seen as human beings. Seeing beyond the illness was considered important. A smiling nurse was described as being able to alleviate pain and anxiety. Clients hoped to have a close relationship with the nurse. The development of a close relationship was considered to be more likely if the nurse is familiar and genuine. Clients wished the nurses to have a more attending presence. Clients suggested that the work areas of the nurses could be limited so that they would have more time to transfer from one place to another. Clients felt that they would benefit from this as well. The nurses were expected to be more considerate. Clients wished for more information regarding changes that affect their care. They wished to be informed about changes in schedules and plans. Clients hoped for continuity in the nurse-client relationship. Considering the expectations of clients promotes client satisfaction. Home care providers have an opportunity to reflect on their own care behaviour in light of the findings. To better meet the needs of the clients, nurses could apply the concept of loving-kindness in their work and strive for a more attending presence.
Abstract:
Acute exacerbation of COPD is one of the most common causes of hospital admission in patients affected by this disease. In most cases, consideration of differential diagnoses and assessment of important comorbidities will allow the clinician to decide whether or not the patient needs to be hospitalized. A decision to hospitalize will be based on specific symptoms and signs, as well as on the patient's history. In contrast to bronchial asthma, a systematic action-plan strategy is lacking for COPD. However, a disease management plan involving all the health care providers may have the potential to improve the patient's well-being and to decrease the costs related to these exacerbations.
Abstract:
The vast territories that were radioactively contaminated during the 1986 Chernobyl accident provide a substantial data set of radioactive monitoring data, which can be used for the verification and testing of the different spatial estimation (prediction) methods involved in risk assessment studies. Using the Chernobyl data set for such a purpose is motivated by its heterogeneous spatial structure (the data are characterized by large-scale correlations, short-scale variability, spotty features, etc.). The present work is concerned with the application of the Bayesian Maximum Entropy (BME) method to estimate the extent and the magnitude of the radioactive soil contamination by 137Cs due to the Chernobyl fallout. The powerful BME method allows rigorous incorporation of a wide variety of knowledge bases into the spatial estimation procedure, leading to informative contamination maps. Exact measurements ("hard" data) are combined with secondary information on local uncertainties (treated as "soft" data) to generate science-based uncertainty assessment of soil contamination estimates at unsampled locations. BME describes uncertainty in terms of the posterior probability distributions generated across space, whereas no assumption about the underlying distribution is made and non-linear estimators are automatically incorporated. Traditional estimation variances based on the assumption of an underlying Gaussian distribution (analogous, e.g., to the kriging variance) can be derived as a special case of the BME uncertainty analysis. The BME estimates obtained using hard and soft data are compared with the BME estimates obtained using only hard data. The comparison involves both the accuracy of the estimation maps using the exact data and the assessment of the associated uncertainty using repeated measurements. Furthermore, a comparison of the spatial estimation accuracy obtained by the two methods was carried out using a validation data set of hard data. Finally, a separate uncertainty analysis was conducted to evaluate the ability of the posterior probabilities to reproduce the distribution of the raw repeated measurements available at certain populated sites. The analysis provides an illustration of the improvement in mapping accuracy obtained by adding soft data to the existing hard data and, in general, demonstrates that the BME method performs well both in terms of estimation accuracy and in terms of estimation error assessment, which are both useful features for the Chernobyl fallout study.
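For contrast with the BME approach, the sketch below implements the classical baseline the abstract alludes to, a simple-kriging (Gaussian-process style) prediction from "hard" measurements only, with an assumed exponential covariance model; the coordinates, values, and covariance parameters are illustrative and are not Chernobyl data, and soft data are not used.

```python
# Classical simple-kriging baseline from hard data only (not BME; illustrative values).
import numpy as np

def exp_cov(d, sill=1.0, corr_range=30.0):
    """Exponential covariance model C(d) = sill * exp(-d / corr_range)."""
    return sill * np.exp(-d / corr_range)

# hard data: coordinates (km) and measured 137Cs deposition (arbitrary units)
xy = np.array([[0.0, 0.0], [10.0, 5.0], [25.0, 12.0], [40.0, 3.0]])
z = np.array([5.2, 4.1, 2.7, 1.3])
x0 = np.array([15.0, 8.0])                               # unsampled target location

d_data = np.linalg.norm(xy[:, None, :] - xy[None, :, :], axis=-1)
d_pred = np.linalg.norm(xy - x0, axis=-1)
C = exp_cov(d_data) + 1e-9 * np.eye(len(z))              # data-to-data covariance
c0 = exp_cov(d_pred)                                     # data-to-target covariance

w = np.linalg.solve(C, c0)                               # simple-kriging weights
mean = z.mean() + w @ (z - z.mean())                     # estimate (data average used as mean)
var = exp_cov(0.0) - w @ c0                              # kriging variance
print(f"estimate at {x0}: {mean:.2f} (variance {var:.2f})")
```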