791 results for Computing methodologies


Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND Breast cancer survivors suffer physical impairment after oncology treatment. This impairment reduces quality of life (QoL) and increases the prevalence of problems associated with an unhealthy lifestyle (for example, decreased aerobic capacity and strength, weight gain, and fatigue). Recent work has shown that exercise adapted to patients' individual characteristics is associated with improved overall and disease-free survival. Technological support through telerehabilitation systems is now a promising strategy, with the great advantage of quick and efficient contact with the health professional. The role of telerehabilitation through therapeutic exercise as a support tool for implementing an active lifestyle is not yet known, although an active lifestyle has been shown to be an effective resource for improving fitness and reducing musculoskeletal disorders in these women. METHODS/DESIGN This study will use a two-arm, assessor-blinded, parallel randomized controlled trial design. Women will be eligible if they have a diagnosis of stage I, II, or IIIA breast cancer; have no chronic disease or orthopedic issues that would interfere with the ability to participate in a physical activity program; have Internet access and basic computer skills, or live with a relative who has them; have completed adjuvant therapy (except hormone therapy) and have no history of cancer recurrence; and have an interest in improving their lifestyle. Participants will be randomized into e-CUIDATE or usual-care groups. E-CUIDATE gives participants access to a range of content: exercise plans arranged in series with breathing, mobility, strength, and stretching exercises. All of these exercises will be assigned to women in the telerehabilitation group according to perceived needs. The control group will be asked to maintain their usual routine. Study endpoints will be assessed after 8 weeks (immediate effects) and after 6 months. 
The primary outcome will be QoL, measured with the European Organization for Research and Treatment of Cancer Quality of Life Questionnaire Core 30 (version 3.0) and its breast module, the European Organization for Research and Treatment of Cancer Breast Cancer-Specific Quality of Life Questionnaire. The secondary outcomes are: pain (algometry, Visual Analogue Scale, Brief Pain Inventory Short Form); body composition; physical measures (abdominal test, handgrip strength, back muscle strength, and multiple sit-to-stand test); cardiorespiratory fitness (International Fitness Scale, 6-minute walk test, International Physical Activity Questionnaire-Short Form); fatigue (Piper Fatigue Scale and Borg Fatigue Scale); anxiety and depression (Hospital Anxiety and Depression Scale); cognitive function (Trail Making Test and Auditory Consonant Trigram); accelerometry; lymphedema; and anthropometric perimeters. DISCUSSION This study investigates the feasibility and effectiveness of a telerehabilitation system during adjuvant treatment of patients with breast cancer. If this treatment option proves effective, telehealth systems could offer a choice of supportive care to cancer patients during the survivorship phase. TRIAL REGISTRATION ClinicalTrials.gov identifier: NCT01801527.

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND: The elderly population is particularly at risk of developing vitamin B12 deficiency. Serum cobalamin does not necessarily reflect B12 status. The determination of methylmalonic acid is not available in all laboratories, and issues of sensitivity for holotranscobalamin and the low specificity of total homocysteine limit their utility. The aim of the present study is to establish a diagnostic algorithm that uses a combination of these markers in place of a single measurement. METHODS: We compared the diagnostic efficiency of these markers for the detection of vitamin B12 deficiency in a population (n = 218) of institutionalized elderly people (median age 80 years). Biochemical, haematological and morphological data were used to categorize participants with or without vitamin B12 deficiency. RESULTS: In receiver operating characteristic curves for the detection of vitamin B12 deficiency using single measurements, serum folate had the greatest area under the curve (0.87) and homocysteine the lowest (0.67). The best specificity was observed for erythrocyte folate and methylmalonic acid (100% for both), but their sensitivity was very low (17% and 53%, respectively). The highest sensitivity was observed for homocysteine (81%) and serum folate (74%). When we combined these markers, starting with serum and erythrocyte folate, followed by holotranscobalamin, and ending with methylmalonic acid measurements, the overall sensitivity and specificity of the algorithm were 100% and 90%, respectively. CONCLUSION: The proposed algorithm, which combines erythrocyte folate, serum folate, holotranscobalamin and methylmalonic acid but eliminates B12 and tHcy measurements, is a useful alternative for vitamin B12 deficiency screening in an elderly institutionalized cohort.
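The stepwise marker sequence described above can be sketched as a simple decision function. The cutoff values, units and function name below are illustrative placeholders, since the abstract does not report the study's actual thresholds:

```python
def b12_deficiency_screen(serum_folate, rbc_folate, holo_tc, mma,
                          serum_folate_cutoff=45.0,   # nmol/L, hypothetical
                          rbc_folate_cutoff=600.0,    # nmol/L, hypothetical
                          holo_tc_cutoff=35.0,        # pmol/L, hypothetical
                          mma_cutoff=0.40):           # umol/L, hypothetical
    """Sequential screen following the order described in the abstract:
    folate markers first, then holotranscobalamin, then methylmalonic acid.
    All cutoffs are illustrative placeholders, not the study's values."""
    # Step 1: clearly adequate folate markers argue against deficiency.
    if serum_folate >= serum_folate_cutoff and rbc_folate >= rbc_folate_cutoff:
        return "deficiency unlikely"
    # Step 2: low holotranscobalamin suggests depleted active B12.
    if holo_tc < holo_tc_cutoff:
        return "probable deficiency"
    # Step 3: elevated methylmalonic acid confirms functional deficiency.
    return "probable deficiency" if mma > mma_cutoff else "deficiency unlikely"
```

The point of the sequencing is economic as much as diagnostic: the cheap, widely available folate assays filter out most participants before the more expensive holotranscobalamin and methylmalonic acid assays are needed.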

Relevance:

60.00%

Publisher:

Abstract:

We examined the information from the legionellosis surveillance system in the municipality of Cordoba (Spain) as a preventive resource for environmental health management, and we describe the public health implications of investigating sporadic cases. A multidisciplinary team analyzed surveillance information on water sites potentially contaminated by Legionella together with the clinical data of patients classified as community cases, mapping patients' home addresses using geographic information systems (GIS). Legionella pneumophila serogroup 1 was identified in 31 sporadic cases, and 53 suspected sources, mainly cooling towers, were investigated. In no case could the source of infection be located, even though poorly maintained systems, structural deficiencies and operational failures were apparent. The use of GIS allowed us to identify two geographic areas, each with a radius of less than 500 meters, where cases were concentrated. The finding of these two suspected urban clusters suggested reorienting the preventive strategy in public health.

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND & AIMS Hy's Law, which states that hepatocellular drug-induced liver injury (DILI) with jaundice indicates a serious reaction, is used widely to determine risk for acute liver failure (ALF). We aimed to optimize the definition of Hy's Law and to develop a model for predicting ALF in patients with DILI. METHODS We collected data from 771 patients with DILI (805 episodes) from the Spanish DILI registry, from April 1994 through August 2012. We analyzed data collected at DILI recognition and at the time of peak levels of alanine aminotransferase (ALT) and total bilirubin (TBL). RESULTS Of the 771 patients with DILI, 32 developed ALF. Hepatocellular injury, female sex, high levels of TBL, and a high ratio of aspartate aminotransferase (AST):ALT were independent risk factors for ALF. We compared 3 ways to use Hy's Law to predict which patients would develop ALF; all included TBL greater than 2-fold the upper limit of normal (×ULN) and either ALT level greater than 3 × ULN, a ratio (R) value (ALT ×ULN divided by alkaline phosphatase ×ULN) of 5 or greater, or a new ratio (nR) value (ALT or AST, whichever produced the higher ×ULN value, divided by alkaline phosphatase ×ULN) of 5 or greater. At recognition of DILI, the R- and nR-based models identified patients who developed ALF with 67% and 63% specificity, respectively, whereas use of only ALT level identified them with 44% specificity. However, the ALT level and the nR model each identified patients who developed ALF with 90% sensitivity, whereas the R criteria identified them with 83% sensitivity. An equal number of patients who did and did not develop ALF had alkaline phosphatase levels greater than 2 × ULN. An algorithm based on AST level greater than 17.3 × ULN, TBL greater than 6.6 × ULN, and AST:ALT greater than 1.5 identified patients who developed ALF with 82% specificity and 80% sensitivity. 
CONCLUSIONS When applied at DILI recognition, the nR criteria for Hy's Law provide the best balance of sensitivity and specificity, whereas our new composite algorithm provides additional specificity in predicting the ultimate development of ALF.
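As a rough illustration, the nR-based Hy's Law criterion and the composite algorithm can be written as predicate functions. The ULN defaults below are generic reference-range values chosen for the example, not values from the study:

```python
def x_uln(value, uln):
    """Express a lab value as a multiple of its upper limit of normal (ULN)."""
    return value / uln

def meets_nr_hys_law(alt, ast, alp, tbl,
                     alt_uln=40.0, ast_uln=40.0, alp_uln=120.0, tbl_uln=1.2):
    """nR-based Hy's Law as described in the abstract: TBL > 2 x ULN together
    with nR = (max of ALT or AST in x ULN) / (ALP in x ULN) >= 5.
    The ULN defaults are illustrative reference-range values, not study data."""
    nr = max(x_uln(alt, alt_uln), x_uln(ast, ast_uln)) / x_uln(alp, alp_uln)
    return tbl > 2 * tbl_uln and nr >= 5

def meets_composite_algorithm(ast, alt, tbl, ast_uln=40.0, tbl_uln=1.2):
    """Composite rule from the abstract: AST > 17.3 x ULN, TBL > 6.6 x ULN,
    and AST:ALT > 1.5 (ULN defaults again purely illustrative)."""
    return (x_uln(ast, ast_uln) > 17.3
            and x_uln(tbl, tbl_uln) > 6.6
            and ast / alt > 1.5)
```

Normalizing every analyte to multiples of its own ULN is what lets the criteria be stated independently of any particular laboratory's reference ranges.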

Relevance:

60.00%

Publisher:

Abstract:

In 2006 the Library of the Andalusian Public Health System (BVSSPA) was constituted as a virtual library providing resources and services accessed through the inter-hospital local area network (corporate intranet) and the Internet. At that time the Hospital de la Axarquia still had no institutional presence on the Internet, and the librarian identified the need to create a space for communication with the "digital users" of the area's library through a website. MATERIALS AND METHODS. The reasons we opted for a blog were: it required no financial outlay to set up; it allowed great versatility, both in its administration and in its use by users; and it made it possible to bring together different Web 2.0 communication tools on a single platform. Among the options available we chose Blogger (Google Inc.). The blog brought 2.0 services, or the Social Web, into the library. The benefits were many, especially the visibility of the service and communication with users. The 2.0 tools incorporated into the library are: content syndication (RSS), which allows users to stay informed about updates to the blog; sharing of documents and other multimedia, with presentations through SlideShare, images through Flickr or Picasa, and videos through YouTube; and a presence on social networks such as Facebook and Twitter. RESULTS. We tracked activity with the Google Analytics tool, which helped determine the number of blog visits. From its establishment on November 17th, 2006, until November 29th, 2010, the blog received 15,787 visitors and 38,422 page views, with an average of 2.4 pages consulted per visit and an average stay on the site of 4 minutes 31 seconds per visit. DISCUSSION. The blog has served as a communication and information tool with the user. Since the creation of the blog we have incorporated technologies and tools to interact with users. 
With all the tools used we have applied the concept of "open source", and the contents were generated from the activities organized in the Knowledge Management Unit: the anatomo-clinical sessions, training activities, dissemination events, etc. The result has been the customization of library services, contextualized in the Knowledge Management Unit - Axarquia. On social networks we have shared information and files with professionals and the community. CONCLUSIONS. The blog has allowed us to explore technologies for communicating with users and the community, disseminating information and documents with the participation of users, and becoming the "Interactive Library" we aspire to be.

Relevance:

60.00%

Publisher:

Abstract:

Clinical Decision Support Systems (CDSS) are software applications that support clinicians in making healthcare decisions by providing relevant information for individual patients about their specific conditions. The lack of integration between CDSS and Electronic Health Records (EHR) has been identified as a significant barrier to CDSS development and adoption. The Andalusia Healthcare Public System (AHPS) provides an interoperable health information infrastructure based on a Service Oriented Architecture (SOA) that eases CDSS implementation. This paper details the deployment of a CDSS together with a Terminology Server (TS) within the AHPS infrastructure. It also presents a case study on the application of decision support to thromboembolism patients and its potential impact on improving patient safety. We apply the inSPECt tool proposal to evaluate the appropriateness of alerts in this scenario.

Relevance:

60.00%

Publisher:

Abstract:

Introduction: Our goal was to characterize the web content of pest control services and examine the technical information they make available to users through their webpages. Method: A total of 70 webpages of biocide services in the province of Málaga (Spain) were analyzed. We used 15 evaluation indicators grouped into 5 parameters relating to the service provider's data; the reliability of the information and services; the accuracy of content and writing style; technical resources; and interaction with users. Sectoral legislation, official records of products and deliveries, standards and technical guides were used as test instruments. Results: Companies showed a remarkable degree of awareness in the implementation and use of new technologies. Negative aspects that can affect users' confidence were identified, relating to the reliability of the information and to deficiencies in the description of the service portfolio and the companies' credentials. The integration and use of collaborative 2.0 platforms was poorly developed and underexploited. Discussion: It is possible to improve users' trust by intervening in those aspects that affect the reliability of the information provided on the web.

Relevance:

60.00%

Publisher:

Abstract:

BACKGROUND The skin patch test is the gold standard method for diagnosing contact allergy. Although used for more than 100 years, the patch test procedure is performed with variability around the world. A number of factors can influence the test results, namely the quality of the reagents used, the timing of the application, the patch test series (allergens/haptens) used for testing, the appropriate interpretation of the skin reactions, and the evaluation of the patient's benefit. METHODS We performed an Internet-based survey with 38 questions covering the educational background of respondents, patch test methods and interpretation. The questionnaire was distributed among all representatives of national member societies of the World Allergy Organization (WAO) and the WAO Junior Members Group. RESULTS One hundred sixty-nine completed surveys were received from 47 countries. The majority of participants had more than 5 years of clinical practice (61 %) and routinely carried out patch tests (70 %). Both allergists and dermatologists were responsible for carrying out the patch tests. We observed the use of many different guidelines regardless of geographical distribution. The use of home-made preparations was indicated by 47 % of participants, and 73 % of the respondents performed 2 or 3 readings. Most of the respondents reported having patients with adverse reactions, including erythroderma (12 %); however, only 30 % completed a consent form before conducting the patch test. DISCUSSION The heterogeneity of patch test practices may be influenced by the level of awareness of clinical guidelines, different training backgrounds, accessibility of various types of devices, the patch test series (allergens/haptens) used for testing, the type of clinical practice (public or private, clinical or research-based institution), infrastructure availability, financial/commercial implications, and regulations, among other factors. 
CONCLUSION There is a lack of worldwide homogeneity in patch test procedures, and this raises concerns about the need for standardization and harmonization of this important diagnostic procedure.

Relevance:

60.00%

Publisher:

Abstract:

The increasing interconnection of information and communication systems leads to ever greater complexity and thus to an ever greater number of security vulnerabilities. Classical protection mechanisms such as firewall systems and anti-malware solutions have long ceased to offer adequate protection against intrusion attempts on IT infrastructures. Intrusion detection systems (IDS) have established themselves as a very effective instrument of protection against cyber attacks. Such systems collect and analyze information from network components and hosts in order to detect unusual behaviour and security violations automatically. While signature-based approaches can only detect already-known attack patterns, anomaly-based IDS are also able to recognize new, previously unknown attacks (zero-day attacks) at an early stage. The core problem of intrusion detection systems, however, lies in the optimal processing of the enormous volume of network data and in the development of an adaptive detection model that works in real time. To address these challenges, this dissertation provides a framework consisting of two main parts. The first part, called OptiFilter, uses a dynamic queuing concept to process the continuously arriving network data, continuously assembles network connections, and exports structured input data for the IDS. The second part is an adaptive classifier comprising a classifier model based on an Enhanced Growing Hierarchical Self-Organizing Map (EGHSOM), a model of normal network behaviour (NNB), and an update model. In OptiFilter, tcpdump and SNMP traps are used to aggregate network packets and host events continuously. These aggregated network packets and host events are then analyzed further and converted into connection vectors. 
To improve the detection rate of the adaptive classifier, the artificial neural network GHSOM is investigated intensively and substantially extended. This dissertation proposes and discusses several approaches: a classification-confidence margin threshold is defined to uncover unknown malicious connections; the stability of the growth topology is increased through novel approaches to initializing the weight vectors and through strengthening the winner neurons; and a self-adaptive procedure is introduced to keep the model continuously updated. In addition, the main task of the NNB model is the further examination of the unknown connections detected by the EGHSOM and the verification of whether they are in fact normal. However, network traffic changes constantly because of the concept-drift phenomenon, which leads to the generation of non-stationary network data in real time. This phenomenon is controlled by the update model. The EGHSOM model can detect new anomalies effectively, and the NNB model adapts optimally to the changes in the network data. In the experimental investigations the framework showed promising results. In the first experiment the framework was evaluated in offline mode: OptiFilter was evaluated with offline, synthetic and realistic data, and the adaptive classifier was evaluated with 10-fold cross-validation to estimate its accuracy. In the second experiment the framework was installed on a 1 to 10 GB network link and evaluated online in real time. OptiFilter successfully converted the enormous amount of network data into structured connection vectors, and the adaptive classifier classified them precisely. 
The comparative study between the developed framework and other well-known IDS approaches shows that the proposed IDS framework outperforms all the other approaches. This can be attributed to the following key points: the processing of the collected network data, the achievement of the best performance (e.g. overall accuracy), the detection of unknown connections, and the development of a real-time intrusion detection model.
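The classification-confidence margin threshold can be illustrated with a minimal sketch: a connection vector is mapped to the nearest prototype of a map, and connections whose distance to the best-matching unit exceeds the margin are flagged as unknown for further analysis. The prototypes, dimensionality and margin value below are all assumptions of this sketch, not values from the dissertation:

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "trained map": 9 prototype weight vectors over 4 connection features.
# In the dissertation this role is played by the trained EGHSOM; here the
# prototypes, dimensionality and margin are purely illustrative values.
prototypes = rng.normal(0.0, 1.0, size=(9, 4))

def classify(conn, prototypes, margin=2.5):
    """Map a connection vector to its best-matching unit (BMU). If the
    distance to the BMU exceeds the confidence margin, treat the
    connection as unknown and hand it on for further analysis (the role
    of the normal-network-behaviour model, NNB)."""
    dists = np.linalg.norm(prototypes - conn, axis=1)
    bmu = int(np.argmin(dists))
    status = "unknown" if dists[bmu] > margin else "known"
    return status, bmu

near_status, _ = classify(prototypes[3] + 0.01, prototypes)   # close to a unit
far_status, _ = classify(prototypes[3] + 100.0, prototypes)   # far from all units
```

This separation is what allows anomaly-based detection of zero-day attacks: a signature is not required, only a sufficiently large distance from every learned pattern of normal or known behaviour.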

Relevance:

60.00%

Publisher:

Abstract:

Conselho Nacional de Desenvolvimento Científico e Tecnológico (CNPq)

Relevance:

60.00%

Publisher:

Abstract:

To be competitive in contemporary turbulent environments, firms must be capable of processing huge amounts of information and effectively converting it into actionable knowledge. This is particularly the case in the marketing context, where problems are also usually highly complex, unstructured and ill-defined. In recent years, the development of marketing management support systems has paralleled this evolution in the informational problems faced by managers, leading to a growth in the study (and use) of artificial intelligence and soft computing methodologies. Here, we present and implement a novel intelligent system that incorporates fuzzy logic and genetic algorithms and operates in an unsupervised manner. This approach allows the discovery of interesting association rules, which can be linguistically interpreted, in large-scale databases (Knowledge Discovery in Databases, KDD). We then demonstrate its application to a distribution channel problem. It is shown how the proposed system is able to return a number of novel and potentially interesting associations among variables. Thus, it is argued that our method has significant potential to improve the analysis of marketing and business databases in practice, especially in non-programmed decisional scenarios, as well as to assist scholarly researchers in their exploratory analysis. © 2013 Elsevier Inc.
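A minimal sketch of the general idea, unsupervised discovery of linguistically interpretable fuzzy association rules via genetic search, might look as follows. The dataset, membership functions, fitness measure and GA operators are deliberately simplified illustrations and are not the system described in the paper:

```python
import random

random.seed(42)

# Synthetic marketing-style dataset: (ad_spend, sales) pairs in [0, 10],
# where higher spend tends to come with higher sales. Purely illustrative.
data = [(x, x + random.gauss(0.0, 1.0))
        for x in [random.uniform(0.0, 10.0) for _ in range(200)]]

def mu_low(v):
    """Triangular 'low' membership on the [0, 10] range, clamped to [0, 1]."""
    return min(1.0, max(0.0, 1.0 - v / 5.0))

def mu_high(v):
    """Triangular 'high' membership on the [0, 10] range, clamped to [0, 1]."""
    return min(1.0, max(0.0, (v - 5.0) / 5.0))

SETS = {"low": mu_low, "high": mu_high}

def fitness(rule):
    """Fuzzy confidence of the rule 'IF ad_spend is A THEN sales is C'."""
    a, c = rule
    support = sum(min(SETS[a](x), SETS[c](y)) for x, y in data)
    antecedent = sum(SETS[a](x) for x, _ in data) or 1e-9
    return support / antecedent

def evolve(pop_size=8, generations=20):
    """Minimal genetic search over the rule space: truncation selection
    plus single-gene mutation. Real systems evolve far richer encodings."""
    labels = list(SETS)
    pop = [(random.choice(labels), random.choice(labels)) for _ in range(pop_size)]
    for _ in range(generations):
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        children = [(random.choice(labels), p[1]) if random.random() < 0.5
                    else (p[0], random.choice(labels)) for p in survivors]
        pop = survivors + children
    return max(pop, key=fitness)

best = evolve()  # typically a matched pair, readable as a linguistic rule
```

The evolved rule reads directly as a linguistic statement ("IF ad_spend is high THEN sales is high"), which is the interpretability property the abstract emphasizes.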

Relevance:

60.00%

Publisher:

Abstract:

Every year the production volume of castings grows, and the production of non-ferrous metals in particular is growing thanks to aluminium. As a result, the requirements on casting quality also increase. Foundry specialists all over the world put great effort into managing the problem of casting defects. In this article the authors present an approach based on cognitive models that help visualize the inner cause-and-effect relations leading to casting defects in the foundry process. These cognitive models comprise a diverse network of factors and their relations, which together describe in detail the foundry process and its influence on the appearance of casting defects. Moreover, the article contains an example of a simple die-casting model and simulation results. Implementation of the proposed method will help foundry specialists reveal the mechanism and the main causes of casting defect formation.
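The kind of cognitive model described can be illustrated with a small fuzzy cognitive map (FCM). The concepts, edge weights and scenario below are invented for this sketch and are not taken from the article:

```python
import math

# Illustrative fuzzy cognitive map of a die-casting process. The concepts
# and edge weights are invented for this sketch, not taken from the article.
concepts = ["melt_temperature", "die_temperature", "gas_entrapment", "porosity_defects"]

# weights[i][j]: influence of concept i on concept j, in [-1, 1]
weights = [
    [0.0, 0.4, 0.3, 0.0],   # hot melt warms the die and raises gas pickup
    [0.0, 0.0, 0.2, 0.3],   # a hot die promotes entrapment and defects
    [0.0, 0.0, 0.0, 0.7],   # entrapped gas strongly drives porosity
    [0.0, 0.0, 0.0, 0.0],   # defects are a sink node in this toy map
]

def sigmoid(v):
    return 1.0 / (1.0 + math.exp(-v))

def simulate(state, steps=20):
    """Standard synchronous FCM update: each concept combines its own
    activation with the weighted activations of its cause concepts."""
    for _ in range(steps):
        state = [sigmoid(state[j] + sum(state[i] * weights[i][j]
                                        for i in range(len(state))))
                 for j in range(len(state))]
    return state

# Scenario: overheated melt, everything else near nominal.
final = simulate([0.9, 0.5, 0.1, 0.0])
```

Running what-if scenarios of this kind, raising one concept's activation and watching defect-related concepts respond, is what makes the cause-and-effect network useful for tracing how casting defects form.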

Relevance:

60.00%

Publisher:

Abstract:

International audience

Relevance:

60.00%

Publisher:

Abstract:

Doctoral dissertation in Electronics and Computing, Faculdade de Ciências e Tecnologia, Universidade do Algarve, 2004

Relevance:

30.00%

Publisher:

Abstract:

In the following text I will develop three major aspects. The first is to draw attention to what seem to have been the disciplinary fields where, despite everything, the Digital Humanities (in the broad perspective adopted here) have asserted themselves most comprehensively. I think it is here that I run the greatest risks, not only for the reasons mentioned above, but also because a significant part of the achievements and of the researchers may have escaped the look I sought to cast upon the past few decades, a look always influenced by my own experience and by my work in the field of History. This can, however, be considered a work in progress, open to criticism and suggestions. A second point to note is that emphasis will be given to the main lines of development in the relationship between historical research and digital methodologies, resources and tools. Finally, I will attempt a brief analysis of how the Digital Humanities discourse has been appropriated in recent years, with admittedly debatable data and methods, because studies are still scarce and little systematic information is available that would allow one to go beyond an introductory reflection.