859 results for Design for flexibility in use
Abstract:
Technology advances in recent years have dramatically changed the way users exploit content and services available on the Internet, by enforcing pervasive and mobile computing scenarios and enabling access to networked resources from almost everywhere, at any time, and independently of the device in use. In addition, people increasingly expect to customize their experience by exploiting specific device capabilities and limitations, inherent features of the communication channel in use, and interaction paradigms that differ significantly from the traditional request/response one. This so-called Ubiquitous Internet scenario calls for solutions that address many different challenges, such as device mobility, session management, content adaptation, context awareness, and the provisioning of multimodal interfaces. Moreover, new service opportunities demand simple and effective ways to integrate existing resources into new, value-added applications that can also undergo run-time modifications according to ever-changing execution conditions. Although service-oriented architectural models are gaining momentum as a way to tame the increasing complexity of composing and orchestrating distributed and heterogeneous functionalities, existing solutions generally lack a unified approach and only provide support for specific Ubiquitous Internet aspects. Moreover, they usually target rather static scenarios and scarcely support the dynamic nature of pervasive access to Internet resources, which can quickly render existing compositions obsolete or inadequate and hence in need of reconfiguration. This thesis proposes a novel middleware approach to deal comprehensively with the facets of the Ubiquitous Internet and to assist in establishing innovative application scenarios. We claim that a truly viable ubiquity support infrastructure must neatly decouple the distributed resources it integrates and push any content-related logic outside its core layers, retaining only management and coordination responsibilities. Furthermore, we promote an innovative, open, and dynamic resource composition model that makes it easy to describe and enforce complex scenario requirements and to react suitably to changes in the execution conditions.
Abstract:
Healthcare, Human-Computer Interfaces (HCI), security, and biometry are the most promising application scenarios directly involved in the evolution of Body Area Networks (BANs). Both wearable devices and sensors integrated directly into garments envision a world in which each of us is supervised by an invisible assistant monitoring our health and daily-life activities. New opportunities are enabled by improvements in sensor miniaturization and in the transmission efficiency of wireless protocols, which allow high computational power to be integrated into independent, energy-autonomous, small-form-factor devices. Applications serve various purposes: (I) data collection for off-line knowledge discovery; (II) notifying users about their activities or warning them when a danger occurs; (III) biofeedback rehabilitation; (IV) remote alarm activation in case the subject needs assistance; (V) introduction of a more natural interaction with the surrounding computerized environment; (VI) user identification by physiological or behavioral characteristics. Telemedicine and mHealth [1] are two of the leading concepts directly related to healthcare. The ability to wear unobtrusive devices supports users' autonomy: users gain a new sense of freedom, backed not only by psychological reassurance but by a real improvement in safety. Furthermore, the medical community aims to introduce new devices that innovate patient treatment, in particular by extending ambulatory analysis to real-life scenarios through continuous acquisition. The wide diffusion of emerging portable wellness equipment has also extended the use of wearable devices to fitness and training, where they monitor the user's performance of the task at hand. Learning the correct execution technique in work, sport, or music can be supported by an electronic trainer that furnishes adequate aid. HCIs have made real the concepts of Ubiquitous and Pervasive Computing and Calm Technology introduced in 1988 by Mark Weiser and John Seely Brown. These concepts promote the creation of pervasive environments that enhance the human experience. Context-aware, adaptive, and proactive environments serve and help people by becoming sensitive and reactive to their presence, since electronics are ubiquitous and deployed everywhere. In this thesis we address the integration of all the aspects involved in developing a BAN. Starting from the choice of sensors, we design the node, configure the radio network, implement real-time data analysis, and provide feedback to the user. We present algorithms to be implemented in a wearable assistant for posture and gait analysis and to provide assistance under different walking conditions, helping to prevent falls. Our aim of contributing to the development of non-proprietary solutions drove us to integrate commercial and standard components into our devices. We used sensors available on the market and avoided designing specialized sensors in ASIC technologies, and we employed standard radio protocols and open-source projects wherever possible. The specific contributions of the PhD research activities are presented and discussed in the following. • We designed and built several wireless sensor nodes providing both sensing and actuation capabilities, with a focus on flexibility, small form factor, and low power consumption. The key idea was to develop a simple and general-purpose architecture for rapid analysis, prototyping, and deployment of BAN solutions. Two different sensing units are integrated: kinematic (3D accelerometer and 3D gyroscopes) and kinetic (foot-floor contact pressure forces). Two kinds of feedback were implemented: audio and vibrotactile. • Since the system built is a suitable platform for testing and measuring the features and constraints of a sensor network (radio communication, network protocols, power consumption, and autonomy), we compared Bluetooth and ZigBee performance in terms of throughput and energy efficiency. Field tests evaluated their usability in the fall-detection scenario. • To prove the flexibility of the designed architecture, we implemented a wearable system for human posture rehabilitation. The application was developed in conjunction with biomedical engineers, who provided the audio algorithms that furnish biofeedback to the user about his/her stability. • We explored off-line gait analysis of the collected data, developing an algorithm to detect foot inclination in the sagittal plane during walking. • In collaboration with the Wearable Lab – ETH, Zurich, we developed an algorithm to monitor the user under several walking conditions in which the user carries a load. The remainder of the thesis is organized as follows. Chapter I gives an overview of Body Area Networks (BANs), illustrating the relevant features of this technology and the key challenges still open; it concludes with a short list of real solutions and prototypes proposed by academic research and manufacturers. The domain of posture and gait analysis, and the methodologies and technologies used to provide real-time feedback on detected events, are illustrated in Chapter II. Chapters III and IV present BANs developed to detect falls and to monitor gait, respectively, taking advantage of two inertial measurement units and baropodometric insoles. Chapter V reports an audio-biofeedback system to improve balance based on information about the user's centre of mass. A walking assistant based on a KNN classifier, which detects gait alterations during load carriage, is described in Chapter VI.
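The Chapter VI walking assistant mentioned above relies on a KNN classifier over features extracted from the wearable sensors. As a hedged illustration only — the feature set, class labels, and value of k below are invented placeholders, not taken from the thesis — a minimal brute-force KNN over IMU-derived gait features might look like this:

```cpp
// Minimal brute-force k-nearest-neighbour classifier over IMU feature vectors.
// Features, labels, and k are illustrative placeholders, not the thesis's values.
#include <algorithm>
#include <array>
#include <iostream>
#include <utility>
#include <vector>

struct Sample {
    std::array<double, 3> f;  // e.g. stride time, trunk sway, acceleration variance
    int label;                // e.g. 0 = unloaded gait, 1 = altered gait under load
};

int knnClassify(const std::vector<Sample>& train, const std::array<double, 3>& q, int k) {
    // Squared Euclidean distance from the query window to every training sample.
    std::vector<std::pair<double, int>> dist;
    for (const Sample& s : train) {
        double d = 0.0;
        for (std::size_t i = 0; i < q.size(); ++i) d += (s.f[i] - q[i]) * (s.f[i] - q[i]);
        dist.push_back({d, s.label});
    }
    // Keep the k closest samples and take a majority vote over their labels.
    std::partial_sort(dist.begin(), dist.begin() + k, dist.end());
    int votes = 0;
    for (int i = 0; i < k; ++i) votes += dist[i].second;
    return (2 * votes > k) ? 1 : 0;
}

int main() {
    std::vector<Sample> train = {
        {{1.05, 0.8, 0.10}, 0}, {{1.10, 0.9, 0.12}, 0}, {{1.08, 0.7, 0.09}, 0},
        {{1.30, 1.6, 0.25}, 1}, {{1.28, 1.5, 0.22}, 1}, {{1.35, 1.7, 0.28}, 1}};
    std::array<double, 3> query = {1.27, 1.4, 0.21};  // features from a new gait window
    std::cout << "predicted class: " << knnClassify(train, query, 3) << "\n";
    return 0;
}
```

In a real deployment the feature windows would be computed on the node from the accelerometer, gyroscope, and insole signals described above; the classifier itself stays this simple.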
Abstract:
The increasing precision of current and future experiments in high-energy physics requires a likewise increase in the accuracy of the calculation of theoretical predictions, in order to find evidence for possible deviations from the generally accepted Standard Model of elementary particles and interactions. Calculating the experimentally measurable cross sections of scattering and decay processes to a higher accuracy directly translates into including higher-order radiative corrections in the calculation. The large number of particles and interactions in the full Standard Model results in an exponentially growing number of Feynman diagrams contributing to any given process in higher orders. Additionally, the appearance of multiple independent mass scales makes even the calculation of single diagrams non-trivial. For over two decades now, the only way to cope with these issues has been to rely on the assistance of computers. The aim of the xloops project is to provide the necessary tools to automate the calculation procedures as far as possible, including the generation of the contributing diagrams and the evaluation of the resulting Feynman integrals. The latter is based on the techniques developed in Mainz for solving one- and two-loop diagrams in a general and systematic way using parallel/orthogonal space methods. These techniques involve a considerable amount of symbolic computations. During the development of xloops it was found that conventional computer algebra systems were not a suitable implementation environment. For this reason, a new system called GiNaC has been created, which allows the development of large-scale symbolic applications in an object-oriented fashion within the C++ programming language. This system, which is now also in use for other projects besides xloops, is the main focus of this thesis. The implementation of GiNaC as a C++ library sets it apart from other algebraic systems. Our results prove that a highly efficient symbolic manipulator can be designed in an object-oriented way, and that having a very fine granularity of objects is also feasible. The xloops-related parts of this work consist of a new implementation, based on GiNaC, of functions for calculating one-loop Feynman integrals that already existed in the original xloops program, as well as the addition of supplementary modules belonging to the interface between the library of integral functions and the diagram generator.
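GiNaC is presented here as a C++ library in which symbolic expressions are ordinary, strongly typed objects. The following minimal usage sketch (a generic illustration of the library's public interface, not code from xloops itself) shows how an expression is built, expanded, differentiated, and numerically evaluated from within C++:

```cpp
// Small GiNaC usage sketch: symbolic expressions as first-class C++ objects.
// Build with something like: g++ example.cpp -o example -lginac -lcln (paths may vary).
#include <iostream>
#include <ginac/ginac.h>
using namespace GiNaC;

int main() {
    symbol x("x"), y("y");

    ex p = pow(x + y, 3);                  // an expression is just a C++ value
    std::cout << p.expand() << std::endl;  // x^3+3*x^2*y+3*x*y^2+y^3

    ex d = p.diff(x);                      // symbolic differentiation w.r.t. x
    std::cout << d << std::endl;           // 3*(x+y)^2

    ex v = d.subs(x == 1).subs(y == 2);    // substitute numeric values
    std::cout << v << std::endl;           // 27
    return 0;
}
```

The point the abstract makes is visible even in this toy: there is no separate interpreter language, so large symbolic applications and the surrounding numerics live in the same compiled C++ program.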
Abstract:
Constant developments in the field of offshore wind energy have increased the range of water depths at which wind farms are planned to be installed. Therefore, in addition to the monopile support structures suitable for shallow waters (up to 30 m), different types of support structures, able to withstand severe sea conditions at greater water depths, have been developed. For water depths above 30 m, the jacket is one of the preferred support types. The jacket is a lightweight support structure which, in combination with the complex nature of environmental loads, is prone to highly dynamic behavior. As a consequence, high stresses with great variability in time can be observed in all structural members. The highest stress concentrations occur in the joints, due to their nature as structural discontinuities and to the notches along the welds present in the joints. This makes them the weakest elements of the jacket in terms of fatigue. In the numerical modeling of jackets for offshore wind turbines, a reduction of local stresses at the chord-brace joints, and consequently an optimization of the model, can be achieved by implementing joint flexibility in the chord-brace joints. Therefore, in this work, the influence of joint flexibility on the fatigue damage in chord-brace joints of a numerical jacket model, subjected to advanced load simulations, is studied.
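Fatigue damage at chord-brace joints in studies of this kind is conventionally obtained by rainflow-counting the simulated hot-spot stress histories and accumulating damage with an S-N curve and the Palmgren-Miner rule. The sketch below shows only that final accumulation step; the S-N parameters and the cycle histogram are illustrative placeholders and do not reproduce the thesis's load simulations or its joint-flexibility model.

```cpp
// Palmgren-Miner damage accumulation from a rainflow-counted stress-range histogram.
// The S-N parameters (log10(a), m) and the cycle counts below are illustrative only.
#include <cmath>
#include <iostream>
#include <vector>

struct Bin {
    double stressRangeMPa;  // hot-spot stress range represented by the bin
    double cycles;          // number of cycles counted in this bin
};

// One-slope S-N curve: N = a * S^(-m), i.e. log10(N) = log10(a) - m * log10(S).
double cyclesToFailure(double stressRangeMPa, double log10a, double m) {
    return std::pow(10.0, log10a - m * std::log10(stressRangeMPa));
}

int main() {
    // Placeholder parameters of the same form as typical offshore design S-N curves.
    const double log10a = 12.164, m = 3.0;

    std::vector<Bin> histogram = {{120.0, 2.0e4}, {80.0, 2.0e5}, {40.0, 3.0e6}};

    double damage = 0.0;
    for (const Bin& b : histogram)
        damage += b.cycles / cyclesToFailure(b.stressRangeMPa, log10a, m);  // n_i / N_i

    std::cout << "accumulated Miner damage D = " << damage
              << (damage < 1.0 ? "  (below the D = 1 failure criterion)\n"
                               : "  (exceeds the D = 1 failure criterion)\n");
    return 0;
}
```

Joint flexibility enters upstream of this step: it changes the simulated stress histories at the joints, and therefore the bin contents fed into the accumulation.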
Abstract:
This thesis describes the investigation of systematically varied organic molecules for use in molecular self-assembly processes. All experiments were performed using high-resolution non-contact atomic force microscopy under UHV conditions and at room temperature. Using this technique, three different approaches for influencing intermolecular and molecule-surface interactions on the insulating calcite(10.4) surface were investigated by imaging the structure formation at the molecular scale. I first demonstrated the functionalization of shape-persistent oligo(p-benzamide)s, engineered by introducing different functional groups, and investigated their effect on structure formation on the sample surface. The molecular core was designed to provide significant electrostatic anchoring towards the surface, while at the same time maintaining the flexibility to fine-tune the resulting structure by adjusting the intermolecular cohesion energy. The success of this strategy is based on a clear separation of the molecule-substrate interaction from the molecule-molecule interaction. My results show that sufficient molecule-surface anchoring can be achieved without restricting the structural flexibility that is needed for the design of complex molecular systems. Three derivatives of terephthalic acid (TPA) were investigated in chapter 7. Here, the focus was on changing the adhesion to the calcite surface by introducing different anchor functionalities to the TPA backbone. For all observed molecules, the strong substrate templating effect results in molecular structures that are strictly oriented along the calcite main crystal directions. This templating is especially pronounced in the case of 2-ATPA, where chain formation on the calcite surface is observed in contrast to the formation of molecular layers in the bulk. At the same time, the amino group of 2-ATPA proved to be an efficient anchor functionality, successfully stabilizing the molecular chains on the sample surface. These findings emphasize, once again, the importance of balancing and fine-tuning molecule-molecule and molecule-surface interactions in order to achieve stable, yet structurally flexible molecular arrangements on the sample surface. In the last chapter, I showed how the intrinsic property of molecular chirality decisively influences the structure formation in molecular self-assembly. This effect is especially pronounced in the case of the chiral heptahelicene-2-carboxylic acid. Deposition of the enantiopure molecules results in the formation of homochiral islands on the sample surface, which is in sharp contrast to the formation of uni-directional double rows upon deposition of the racemate onto the same surface. While it remained uncertain from these previous experiments whether the double rows are composed of hetero- or homochiral molecules, I could clearly answer that question here and demonstrate that the rows are of heterochiral origin. Chirality thus proves to be another important parameter to steer the intermolecular interaction on surfaces. Altogether, the results of this thesis demonstrate that, in order to successfully control structure formation in molecular self-assembly, the correct combination of molecule and surface properties is crucial. This is of special importance when working on substrates that exhibit a strong influence on the structure formation, such as the calcite(10.4) surface. Through the systematic variation of functional groups, several important parameters that influence the balance between molecule-surface and molecule-molecule interactions were identified here, and the results of this thesis can thus act as a guideline for the rational design of molecules for use in molecular self-assembly.
Abstract:
Recent decades have seen both what has been referred to as an "inflation of historical monuments" and an acceleration of the process of "monumentification" affecting buildings of relatively recent date. In order to gain a better understanding of this, Kovacs looked at the experience in countries of Central Europe (Romania, Hungary, Slovenia, the Czech Lands, Slovakia), discovering a number of similarities as well as differences in detail. More important, however, was the discovery of the much wider importance of this phenomenon as a whole, which is particularly visible in this part of Europe, where "European" theory and practice of monument preservation are combined with progressivist demolitionism and traditional "natural" attitudes towards the built environment. Kovacs found that monument preservation has not only become a major occupation within building activity seen as a matter of anthropology, but also seems to be the determining feature of the contemporary cultural attitude. The scale of preservation activity has long since reached the level of urban design as an essential criterion for matters of future development, making it necessary to extend the conclusions of theoretical research down to broader generalities of the building domain. Kovacs then looked at the specific features of the countries concerned, including the survival of traditional building techniques in Romania, and the wide variety of preservationist policies in use.
Abstract:
Hybrid MIMO Phased-Array Radar (HMPAR) is an emerging technology that combines MIMO (multiple-input, multiple-output) radar technology with phased-array radar technology. The new technology is in its infancy, but much of the theoretical work for this specific project has already been completed and is explored in great depth in [1]. A brief overview of phased-array radar systems, MIMO radar systems, and the HMPAR paradigm is given in this paper. This report is the culmination of an effort to support research in MIMO and HMPAR utilizing a concept called intrapulse beamscan. Using intrapulse beamscan, arbitrary spatial coverage can be achieved within one MIMO beam pulse. Therefore, this report focuses on designing waveforms for MIMO radar systems with arbitrary spatial coverage using that phenomenon. With intrapulse beamscan, scanning is done through phase-modulated signal design within one pulse rather than through phase shifters in the phased array over multiple pulses. In addition to using this idea, continuous phase modulation (CPM) signals are considered for their desirable peak-to-average ratio property as well as their low spectral leakage. These MIMO waveforms are designed with three goals in mind. The first goal is to achieve flexible spatial coverage while utilizing intrapulse beamscan; as with almost any radar system, we wish to have flexibility in where we send our signal energy. The second goal is to maintain a peak-to-average ratio close to 1 on the envelope of these waveforms, ensuring a signal that is close to constant modulus. It is desirable to have a radar system transmit at the highest available power; not doing so would further diminish the already very small return signals. The third goal is to ensure low spectral leakage, using various techniques to limit the bandwidth of the designed signals. Spectral containment is important to avoid interference with systems that utilize nearby frequencies in the electromagnetic spectrum. These three goals are realized while allowing for the limitations of real radar systems. In addition to flexible spatial coverage, the report examines the spectral properties of various space-filling techniques for the desired spatial areas. The space-filling techniques examined include Hilbert/Peano curves and standard raster scans.
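The second design goal, a peak-to-average ratio close to 1, is a direct consequence of using purely phase-modulated (constant-envelope) signals. As a much-simplified, single-channel illustration — not the report's HMPAR/CPM waveform design — the sketch below generates a pulse with a quadratic phase law and confirms that its envelope PAPR equals 1:

```cpp
// Build a purely phase-modulated (constant-envelope) pulse and compute its
// peak-to-average power ratio (PAPR). The quadratic phase law used here is a
// simple stand-in for the report's CPM / intrapulse-beamscan designs.
#include <algorithm>
#include <cmath>
#include <complex>
#include <iostream>
#include <vector>

int main() {
    const int N = 1024;                 // samples within one pulse
    const double pi = std::acos(-1.0);

    std::vector<std::complex<double>> s(N);
    for (int n = 0; n < N; ++n) {
        double t = static_cast<double>(n) / N;       // normalized time in [0, 1)
        double phase = 2.0 * pi * (50.0 * t * t);    // quadratic phase law (chirp-like)
        s[n] = std::polar(1.0, phase);               // unit amplitude: all information in the phase
    }

    // PAPR = max|s|^2 / mean|s|^2 ; a constant-modulus signal gives exactly 1.
    double peak = 0.0, avg = 0.0;
    for (const auto& x : s) {
        double p = std::norm(x);        // |x|^2
        peak = std::max(peak, p);
        avg += p;
    }
    avg /= N;
    std::cout << "PAPR = " << peak / avg << std::endl;   // ~1.0
    return 0;
}
```

The report's remaining goals, flexible spatial coverage and spectral containment, are about choosing the phase law itself (and the per-element phase laws across the array), which this single-channel toy does not attempt.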
Abstract:
Radiation metabolomics employing mass spectral technologies represents a plausible means of high-throughput, minimally invasive radiation biodosimetry. A simplified metabolomics protocol is described that employs ubiquitous gas chromatography-mass spectrometry and open-source software, including the random forests machine learning algorithm, to uncover latent biomarkers of 3 Gy gamma radiation in rats. Urine was collected from six male Wistar rats and six sham-irradiated controls for 7 days, 4 prior to irradiation and 3 after irradiation. Water and food consumption, urine volume, body weight, and sodium, potassium, calcium, chloride, phosphate and urea excretion showed major effects from exposure to gamma radiation. The metabolomics protocol uncovered several urinary metabolites that were significantly up-regulated (glyoxylate, threonate, thymine, uracil, p-cresol) and down-regulated (citrate, 2-oxoglutarate, adipate, pimelate, suberate, azelaate) as a result of radiation exposure. Thymine and uracil were shown to derive largely from thymidine and 2'-deoxyuridine, which are known radiation biomarkers in the mouse. The radiation metabolomic phenotype in rats appeared to derive from oxidative stress and effects on kidney function. Gas chromatography-mass spectrometry is a promising platform on which to develop the field of radiation metabolomics further and to assist in the design of instrumentation for use in detecting the biological consequences of environmental radiation release.
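The biomarker discovery in the study is driven by random-forest variable importance computed over GC-MS features; reproducing that machinery is out of scope here, so the sketch below uses a deliberately crude stand-in, ranking metabolites by the log2 ratio of group-mean urinary excretion between irradiated and sham-irradiated animals. Metabolite names are taken from the abstract, but every numeric value is invented.

```cpp
// Crude stand-in for a random-forest importance ranking: order metabolites by
// |log2 fold change| of group-mean urinary excretion, irradiated vs. sham.
// All numeric values are invented, not data from the paper.
#include <algorithm>
#include <cmath>
#include <iostream>
#include <numeric>
#include <string>
#include <vector>

struct Metabolite {
    std::string name;
    std::vector<double> irradiated;  // normalized GC-MS peak areas, one per rat
    std::vector<double> sham;
};

double mean(const std::vector<double>& v) {
    return std::accumulate(v.begin(), v.end(), 0.0) / v.size();
}

int main() {
    std::vector<Metabolite> data = {
        {"glyoxylate", {4.1, 3.8, 4.5}, {1.0, 1.2, 0.9}},
        {"citrate",    {0.6, 0.5, 0.7}, {2.1, 1.9, 2.3}},
        {"thymine",    {3.2, 2.9, 3.6}, {1.1, 1.0, 1.2}}};

    // Signed log2 fold change (irradiated vs. sham) for each metabolite.
    std::vector<std::pair<double, std::string>> ranked;
    for (const auto& m : data)
        ranked.push_back({std::log2(mean(m.irradiated) / mean(m.sham)), m.name});

    // Sort by absolute effect size, largest first.
    std::sort(ranked.begin(), ranked.end(), [](const auto& a, const auto& b) {
        return std::fabs(a.first) > std::fabs(b.first);
    });

    for (const auto& r : ranked)
        std::cout << r.second << "  log2FC = " << r.first << "\n";
    return 0;
}
```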
Abstract:
The goal of this article was to study teachers' professional development related to web-based learning in the context of the teacher community. The object was to learn in what kinds of networks teachers share knowledge of web-based learning and what factors in the community support or challenge teachers' professional development in web-based learning. The findings of the study revealed that there are teachers who are especially active, called the central actors in this study, who collaborate and share knowledge of web-based learning in the teacher community. These central actors share both technical and pedagogical knowledge of web-based learning in networks that include both internal and external relations in the community and involve people, artefacts, and a variety of media. Furthermore, the central actors appear to bridge different fields of teaching expertise in their community. According to the central actors' experiences, the important factors that support teachers' professional development in web-based learning in the community are: the possibility to learn from colleagues and from everyday working practices, an emotionally safe atmosphere, the leader's personal support, and community-level commitment. Also, flexibility in work planning, challenging pupils, lessons shared with colleagues, training events in an authentic work environment, and colleagues' professionalism are considered meaningful for professional development. On the challenge side, knowledge sharing about web-based learning in the community requires mutual interests, transactive memory, time and facilities, peer support, a safe atmosphere, and meaningful pedagogical practices. On the basis of the findings of the study, it is suggested that intensive collaboration related to web-based learning may make it possible to break the boundaries of individual teachership and create sociocultural activities that support collaborative professional development in the teacher community. Teachers' in-service training programs should be more sensitive to the culture of teacher communities and teachers' reciprocal relations. Further, teacher trainers should design teachers' in-service training in web-based learning in co-evolution with supporting networks that include media and artefacts as well as people.
Abstract:
The three-step test is central to the regulation of copyright limitations at the international level. Delineating the room for exemptions with abstract criteria, the three-step test is by far the most important and comprehensive basis for the introduction of national use privileges. It is an essential, flexible element in the international limitation infrastructure that allows national law makers to satisfy domestic social, cultural, and economic needs. Given the universal field of application that follows from the test’s open-ended wording, the provision creates much more breathing space than the more specific exceptions recognized in international copyright law. EC copyright legislation, however, fails to take advantage of the flexibility inherent in the three-step test. Instead of using the international provision as a means to open up the closed EC catalogue of permissible exceptions, offer sufficient breathing space for social, cultural, and economic needs, and enable EC copyright law to keep pace with the rapid development of the Internet, the Copyright Directive 2001/29/EC encourages the application of the three-step test to further restrict statutory exceptions that are often defined narrowly in national legislation anyway. In the current online environment, however, enhanced flexibility in the field of copyright limitations is indispensable. From a social and cultural perspective, the web 2.0 promotes and enhances freedom of expression and information with its advanced search engine services, interactive platforms, and various forms of user-generated content. From an economic perspective, it creates a parallel universe of traditional content providers relying on copyright protection, and emerging Internet industries whose further development depends on robust copyright limitations. In particular, the newcomers in the online market – social networking sites, video forums, and virtual worlds – promise a remarkable potential for economic growth that has already attracted the attention of the OECD. Against this background, the time is ripe to debate the introduction of an EC fair use doctrine on the basis of the three-step test. Otherwise, EC copyright law is likely to frustrate important opportunities for cultural, social, and economic development. To lay groundwork for the debate, the differences between the continental European and the Anglo-American approach to copyright limitations (section 1), and the specific merits of these two distinct approaches (section 2), will be discussed first. An analysis of current problems that have arisen under the present dysfunctional EC system (section 3) will then serve as a starting point for proposing an EC fair use doctrine based on the three-step test (section 4). Drawing conclusions, the international dimension of this fair use proposal will be considered (section 5).
Abstract:
The use of non-heart-beating donor (NHBD) lungs may help to overcome the shortage of lung grafts in clinical lung transplantation, but warm ischaemia and ischaemia/reperfusion injury (I/R injury) resulting in primary graft dysfunction represent a considerable threat. Thus, better strategies for optimized preservation of lung grafts are urgently needed. Surfactant dysfunction has been shown to contribute to I/R injury, and surfactant replacement therapy is effective in enhancing lung function and structural integrity in related rat models. In the present study we hypothesize that surfactant replacement therapy reduces oedema formation in a pig model of NHBD lung transplantation. Oedema formation was quantified with (SF) and without (non-SF) surfactant replacement therapy in interstitial and alveolar compartments by means of design-based stereology in NHBD lungs 7 h after cardiac arrest, reperfusion and transplantation. A sham-operated group served as control. In both NHBD groups, nearly all animals died within the first hours after transplantation due to right heart failure. Both SF and non-SF developed an interstitial oedema of similar degree, as shown by an increase in septal wall volume and arithmetic mean thickness as well as an increase in the volume of peribronchovascular connective tissue. Regarding intra-alveolar oedema, no statistically significant difference could be found between SF and non-SF. In conclusion, surfactant replacement therapy cannot prevent poor outcome after prolonged warm ischaemia of 7 h in this model. While the beneficial effects of surfactant replacement therapy have been observed in several experimental and clinical studies related to heart-beating donor lungs and cold ischaemia, it is unlikely that surfactant replacement therapy will overcome the shortage of organs in the context of prolonged warm ischaemia, for example, 7 h. Moreover, our data demonstrate that right heart function and dysfunctions of the pulmonary vascular bed are limiting factors that need to be addressed in NHBD.
Abstract:
Introduction: Mindfulness-based cognitive therapy for depression (MBCT) has been shown to be effective for the reduction of depressive relapse. However, additional information regarding baseline patient characteristics and process features related to positive response could be helpful both for the provision of MBCT in clinical practice and for its further development. Method: Baseline characteristics, process data, and immediate outcome (symptom change, change in attitudes and trait mindfulness) of 108 patients receiving MBCT in routine care were recorded. A newly developed self-report measure (Daily Mindfulness Scale, DMS) was applied daily during the MBCT program. Additionally, patients filed daily reports on their mindfulness practice. There was no control group available. Results: Patients with more severe initial symptoms indicated greater amounts of symptom improvement, but did not show greater rates of dropout from the MBCT intervention. Younger age was related to higher rates of dropout. In contrast to some previous data, patients with lower levels of initial trait mindfulness showed greater improvement in symptoms, even after controlling for initial levels of symptoms. Adherence to daily mindfulness practice was high. Consistent with this result, the duration of daily mindfulness practice was not related to immediate outcome. Process studies using multivariate time series analysis revealed a specific role of daily mindfulness in reducing subsequent negative mood. Conclusions: Within the range of patients present in this study and the given study design, the results support the use of MBCT in more heterogeneous groups. This demanding intervention was well tolerated by patients with higher levels of symptoms and resulted in significant improvements regarding residual symptoms. Process-outcome analyses of initial trait mindfulness and daily mindfulness both support the crucial role of changes in mindfulness for the effects of MBCT.