945 results for "Site na internet, pesquisa" (website, research)


Relevance:

20.00%

Publisher:

Abstract:

Background: Internet-based surveillance systems provide a novel approach to monitoring infectious diseases. Surveillance systems built on internet data are economically, logistically and epidemiologically appealing and have shown significant promise. The potential of these systems has increased with greater internet availability and shifts in health-related information-seeking behaviour. This approach to monitoring infectious diseases has, however, only been applied to single diseases or small groups of select diseases. This study aims to systematically investigate the potential for developing surveillance and early warning systems using internet search data for a wide range of infectious diseases.

Methods: Official notifications for 64 infectious diseases in Australia were downloaded and correlated with frequencies for 164 internet search terms for the period 2009–13 using Spearman's rank correlations. Time series cross-correlations were performed to assess the potential for search terms to be used in the construction of early warning systems.

Results: Notifications for 17 infectious diseases (26.6%) were found to be significantly correlated with a selected search term. The use of internet metrics as a means of surveillance has not previously been described for 12 (70.6%) of these diseases. The majority of diseases identified were vaccine-preventable, vector-borne or sexually transmissible; cross-correlations, however, indicated that vector-borne and vaccine-preventable diseases are best suited to the development of early warning systems.

Conclusions: The findings of this study suggest that internet-based surveillance systems have broader applicability to monitoring infectious diseases than has previously been recognised. Furthermore, internet-based surveillance systems have a potential role in forecasting emerging infectious disease events, especially for vaccine-preventable and vector-borne diseases.
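
To make the method concrete, below is a minimal sketch of the core analysis described in this abstract: Spearman's rank correlation between a weekly notification series and a search-term frequency series, computed across a range of lags as in a time series cross-correlation. The series, lag range and function names are illustrative assumptions, not the study's data or code.

```python
# Minimal sketch: Spearman's rank correlation between weekly disease
# notifications and search-term frequencies, evaluated at several lags to
# gauge whether searches lead notifications. Data are synthetic placeholders.
import numpy as np
from scipy.stats import spearmanr

def lagged_spearman(notifications, search_freq, max_lag=8):
    """Spearman rho and p-value at each lag; positive lag means searches lead."""
    results = {}
    for lag in range(-max_lag, max_lag + 1):
        if lag > 0:
            rho, p = spearmanr(notifications[lag:], search_freq[:-lag])
        elif lag < 0:
            rho, p = spearmanr(notifications[:lag], search_freq[-lag:])
        else:
            rho, p = spearmanr(notifications, search_freq)
        results[lag] = (rho, p)
    return results

# Synthetic weekly series in which searches lead notifications by two weeks.
rng = np.random.default_rng(0)
search_freq = rng.poisson(50, 260).astype(float)
notifications = np.roll(search_freq, 2) + rng.normal(0, 5, 260)

corrs = lagged_spearman(notifications, search_freq)
best_lag = max(corrs, key=lambda k: corrs[k][0])
print(f"best lag = {best_lag} weeks, rho = {corrs[best_lag][0]:.2f}")
```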

Relevance:

20.00%

Publisher:

Abstract:

Traditionally, the notion of drawing in-situ has suggested the physical presence of the artist in the environment under scrutiny. The assumption here of enhanced connectivity, however, is hasty in light of the idea that situation implies a relative spatial value determined by the interplay of subject and location, where the possibility of not being "in-situ" is problematic. The fact that traditional drawing in-situ, such as the rendering of landscape, requires a framing of the world "out there" suggests a distance between the perceived object of representation and the drawing surface. Rather than suggesting that some drawing is situated and other sorts of drawing are not, however, I argue that situation or site is variously extended and intensified depending on the nature of mediation between surface and environment. The suggestion here is that site is not so much a precondition as a performative function, developed in the act of drawing and always implicating the drawing surface. In my discussion I focus on specific works by Toba Khedoori and Cameron Robbins, and, using my own recent drawing practice as a case study, I argue that the geography of site is delimited neither by the horizon nor by the boundaries of the paper. Rather, I propose that site and drawing surface coincide in variously intensive and extensive ways.

Relevance:

20.00%

Publisher:

Abstract:

The ways in which technology mediates daily activities are shifting rapidly. Global trends point toward the uptake of ambient and interactive media to create radical new ways of working, interacting and socialising. Tech giants such as Google and Apple are banking on the success of this emerging market by investing in new, future-focused consumer products such as Google Glass and the Apple Watch. The potential implications of ubiquitous technological interactions via tangible and ambient media have never been more real or more accessible.

Relevance:

20.00%

Publisher:

Abstract:

This submission is directed to issues arising in respect of the need to recognise and support access to the internet for all Australian residents and citizens. As such, it addresses the following questions only: Question 2-1: What general principles or criteria should be applied to help determine whether a law that interferes with freedom of speech is justified? Question 2-2: Which Commonwealth laws unjustifiably interfere with freedom of speech, and why are these laws unjustified?

Relevance:

20.00%

Publisher:

Abstract:

Have you ever wished you were Doctor Who and could pop yourself and your students into a Tardis and teleport them to an historical event or to meet a historical figure? We all know that, unfortunately, time travel is not (yet) possible, but maybe student and teacher teleportation just might be – sort of. Over the past few centuries, and in lieu of time travel, our communities have developed museums as a means of experiencing some of our history...

Relevance:

20.00%

Publisher:

Abstract:

Background: Surgical site infections (SSIs) are wound infections that occur after invasive (surgical) procedures. Preoperative bathing or showering with an antiseptic skin wash product is a well-accepted procedure for reducing skin bacteria (microflora). It is less clear whether reducing skin microflora leads to a lower incidence of surgical site infection.

Objectives: To review the evidence for preoperative bathing or showering with antiseptics for preventing hospital-acquired (nosocomial) surgical site infections.

Search methods: For this fifth update we searched the Cochrane Wounds Group Specialised Register (searched 18 December 2014); the Cochrane Central Register of Controlled Trials (The Cochrane Library 2014, Issue 11); Ovid MEDLINE (2012 to December Week 4 2014); Ovid MEDLINE (In-Process & Other Non-Indexed Citations, 18 December 2014); Ovid EMBASE (2012 to 2014 Week 51); EBSCO CINAHL (2012 to 18 December 2014); and reference lists of articles.

Selection criteria: Randomised controlled trials comparing any antiseptic preparation used for preoperative full-body bathing or showering with non-antiseptic preparations in people undergoing surgery.

Data collection and analysis: Two review authors independently assessed studies for inclusion, assessed risk of bias and extracted data. Study authors were contacted for additional information.

Main results: We did not identify any new trials for inclusion in this fifth update. Seven trials involving a total of 10,157 participants were included. Four of the included trials had three comparison groups. The antiseptic used in all trials was 4% chlorhexidine gluconate (Hibiscrub/Riohex). Three trials involving 7,791 participants compared chlorhexidine with a placebo. Bathing with chlorhexidine compared with placebo did not result in a statistically significant reduction in SSIs; the relative risk (RR) of SSI was 0.91 (95% confidence interval (CI) 0.80 to 1.04). When only trials of high quality were included in this comparison, the RR of SSI was 0.95 (95% CI 0.82 to 1.10). Three trials of 1,443 participants compared bar soap with chlorhexidine; when combined there was no difference in the risk of SSIs (RR 1.02, 95% CI 0.57 to 1.84). Three trials of 1,192 participants compared bathing with chlorhexidine with no washing; one large study found a statistically significant difference in favour of bathing with chlorhexidine (RR 0.36, 95% CI 0.17 to 0.79), while the smaller studies found no difference between patients who washed with chlorhexidine and those who did not wash preoperatively.

Authors' conclusions: This review provides no clear evidence of benefit for preoperative showering or bathing with chlorhexidine over other wash products in reducing surgical site infection. Efforts to reduce the incidence of nosocomial surgical site infection should focus on interventions where an effect has been demonstrated.
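
The pooled relative risks quoted above are combined estimates across trials. The sketch below shows one standard way such a pooled RR and 95% CI can be computed, using fixed-effect inverse-variance weighting on the log-RR scale; the 2x2 counts are hypothetical placeholders, and the review's own pooling method may differ (e.g., Mantel-Haenszel).

```python
# Minimal sketch of fixed-effect, inverse-variance pooling of relative risks
# on the log scale. The trial counts below are placeholders, not the review's
# actual data; they only illustrate how a combined RR and 95% CI arise.
import math

def log_rr_and_se(events_t, n_t, events_c, n_c):
    """Log relative risk and its standard error from one trial's 2x2 counts."""
    rr = (events_t / n_t) / (events_c / n_c)
    se = math.sqrt(1/events_t - 1/n_t + 1/events_c - 1/n_c)
    return math.log(rr), se

def pooled_rr(trials):
    """Inverse-variance weighted pooled RR with a 95% confidence interval."""
    weights, weighted_logs = [], []
    for events_t, n_t, events_c, n_c in trials:
        log_rr, se = log_rr_and_se(events_t, n_t, events_c, n_c)
        w = 1 / se**2
        weights.append(w)
        weighted_logs.append(w * log_rr)
    pooled_log = sum(weighted_logs) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    ci = (math.exp(pooled_log - 1.96 * pooled_se),
          math.exp(pooled_log + 1.96 * pooled_se))
    return math.exp(pooled_log), ci

# Hypothetical trials: (SSI events, N) in the chlorhexidine arm, then control arm.
trials = [(55, 1400, 61, 1410), (120, 2600, 130, 2580), (40, 900, 44, 890)]
rr, (lo, hi) = pooled_rr(trials)
print(f"pooled RR = {rr:.2f} (95% CI {lo:.2f} to {hi:.2f})")
```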

Relevance:

20.00%

Publisher:

Abstract:

CoMFA and CoMSIA analyses were utilized in this investigation to define the important interacting regions in the paclitaxel/tubulin binding site and to develop selective paclitaxel-like active compounds. The starting geometry of the paclitaxel analogs was taken from the crystal structure of docetaxel. A total of 28 derivatives of paclitaxel were divided into two groups: a training set comprising 19 compounds and a test set comprising nine compounds. They were constructed and geometrically optimized using SYBYL v6.6. The CoMFA studies provided good predictive ability (q² = 0.699, r² = 0.991, PC = 6, S.E.E. = 0.343 and F = 185.910). They showed the steric and electrostatic properties to be the major interacting forces, whilst the lipophilic contribution was a minor factor in recognition at the binding site. These results were in agreement with the experimental binding activities of these compounds. Five fields in the CoMSIA analysis (steric, electrostatic, hydrophobic, hydrogen-bond acceptor and donor properties) were considered contributors to the ligand–receptor interactions. The results obtained from the CoMSIA studies were: q² = 0.535, r² = 0.983, PC = 5, S.E.E. = 0.452 and F = 127.884. The data obtained from both the CoMFA and CoMSIA studies were interpreted with respect to the paclitaxel/tubulin binding site, suggesting where the most significant anchoring points for binding affinity are located. This information could be used for the development of new compounds having paclitaxel-like activity but with new chemical entities, to overcome the existing pharmaceutical barriers and the economic problems associated with the synthesis of paclitaxel analogs. This would broaden the use of this valuable class of compounds, for example in brain tumors, as most of the presently active compounds have poor blood–brain barrier crossing, and various tubulin isotypes have shown resistance to taxanes and other antimitotic agents.
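
The q² and r² statistics reported above come from partial least squares (PLS) models with a fixed number of components ("PC"). The following is a minimal sketch, assuming a descriptor matrix X (standing in for CoMFA/CoMSIA field values) and an activity vector y, of how a leave-one-out cross-validated q² and a fitted r² of that kind can be computed; the data are random placeholders, not the paper's.

```python
# Minimal sketch of leave-one-out cross-validated q2 and fitted r2 for a PLS
# model. X and y are random stand-ins for field descriptors and activities.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import LeaveOneOut

rng = np.random.default_rng(0)
X = rng.normal(size=(19, 200))   # 19 training compounds x field descriptors
y = rng.normal(size=19)          # binding activities (e.g., pIC50)

n_components = 6                 # corresponds to "PC = 6" in the CoMFA model

# Leave-one-out predictions used for q2.
loo_pred = np.empty_like(y)
for train_idx, test_idx in LeaveOneOut().split(X):
    pls = PLSRegression(n_components=n_components)
    pls.fit(X[train_idx], y[train_idx])
    loo_pred[test_idx] = pls.predict(X[test_idx]).ravel()

press = np.sum((y - loo_pred) ** 2)        # predictive residual sum of squares
ss_total = np.sum((y - y.mean()) ** 2)
q2 = 1 - press / ss_total

# Conventional r2 from the model fitted on all training compounds.
pls_full = PLSRegression(n_components=n_components).fit(X, y)
r2 = pls_full.score(X, y)

print(f"q2 (LOO) = {q2:.3f}, r2 (fitted) = {r2:.3f}")
```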

Relevance:

20.00%

Publisher:

Abstract:

Meta-analyses estimate a statistical effect size for a test or an analysis by combining results from multiple studies without necessarily having access to each individual study's raw data. Multi-site meta-analysis is crucial for imaging genetics, as single sites rarely have a sample size large enough to pick up effects of single genetic variants associated with brain measures. However, if raw data can be shared, combining data in a "mega-analysis" is thought to improve power and precision in estimating global effects. As part of an ENIGMA-DTI investigation, we use fractional anisotropy (FA) maps from 5 studies (total N = 2,203 subjects, aged 9-85) to estimate heritability. We combine the studies through meta- and mega-analyses as well as a mixture of the two: combining some cohorts with mega-analysis and meta-analyzing the results with those of the remaining sites. A combination of mega- and meta-approaches may boost power compared to meta-analysis alone.

Relevance:

20.00%

Publisher:

Abstract:

The ENIGMA (Enhancing NeuroImaging Genetics through Meta-Analysis) Consortium was set up to analyze brain measures and genotypes from multiple sites across the world to improve the power to detect genetic variants that influence the brain. Diffusion tensor imaging (DTI) yields quantitative measures sensitive to brain development and degeneration, and some common genetic variants may be associated with white matter integrity or connectivity. DTI measures, such as the fractional anisotropy (FA) of water diffusion, may be useful for identifying genetic variants that influence brain microstructure. However, genome-wide association studies (GWAS) require large populations to obtain sufficient power to detect and replicate significant effects, motivating a multi-site consortium effort. As part of an ENIGMA-DTI working group, we analyzed high-resolution FA images from multiple imaging sites across North America, Australia, and Europe, to address the challenge of harmonizing imaging data collected at multiple sites. Four hundred images of healthy adults aged 18-85 from four sites were used to create a template and corresponding skeletonized FA image as a common reference space. Using twin and pedigree samples of different ethnicities, we used our common template to evaluate the heritability of tract-derived FA measures. We show that our template is reliable for integrating multiple datasets by combining results through meta-analysis and unifying the data through exploratory mega-analyses. Our results may help prioritize regions of the FA map that are consistently influenced by additive genetic factors for future genetic discovery studies. Protocols and templates are publicly available at http://enigma.loni.ucla.edu/ongoing/dti-working-group/.
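
Once subject FA maps have been registered to the common template, skeleton FA values can be extracted against the shared skeletonized reference. The sketch below illustrates that extraction step only, with hypothetical file names; it is not the ENIGMA-DTI protocol itself, which uses a TBSS-style pipeline and the templates linked above.

```python
# Minimal sketch of extracting skeleton-wise FA values for subjects whose FA
# maps are already registered to the common template. File names are
# hypothetical; this is an illustration, not the ENIGMA-DTI protocol.
import numpy as np
import nibabel as nib

# Binary mask of the white-matter skeleton defined on the common template.
skeleton = nib.load("mean_FA_skeleton_mask.nii.gz").get_fdata() > 0

def skeleton_mean_fa(subject_fa_path):
    """Mean FA over the template-space skeleton for one registered subject."""
    fa = nib.load(subject_fa_path).get_fdata()
    return float(fa[skeleton].mean())

subjects = ["sub-001_FA_to_template.nii.gz", "sub-002_FA_to_template.nii.gz"]
mean_fa = {s: skeleton_mean_fa(s) for s in subjects}
print(mean_fa)
```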

Relevance:

20.00%

Publisher:

Abstract:

Combining datasets across independent studies can boost statistical power by increasing the number of observations and can achieve more accurate estimates of effect sizes. This is especially important for genetic studies, where a large number of observations is required to obtain sufficient power to detect and replicate genetic effects. There is a need to develop and evaluate methods for joint analyses of the rich datasets collected in imaging genetics studies. The ENIGMA-DTI consortium is developing and evaluating approaches for obtaining pooled estimates of heritability through meta- and mega-genetic analytical approaches, to estimate the general additive genetic contributions to the intersubject variance in fractional anisotropy (FA) measured from diffusion tensor imaging (DTI). We used the ENIGMA-DTI data harmonization protocol for uniform processing of DTI data from multiple sites. We evaluated this protocol in five family-based cohorts providing data from a total of 2,248 children and adults (ages 9-85) collected with various imaging protocols. We used the imaging genetics analysis tool SOLAR-Eclipse to combine twin and family data from Dutch, Australian and Mexican-American cohorts into one large "mega-family". We showed that heritability estimates may vary from one cohort to another. We used two meta-analytical approaches (sample-size weighted and standard-error weighted) and a mega-genetic analysis to calculate heritability estimates across populations. We performed a leave-one-out analysis of the joint estimates of heritability, removing a different cohort each time to understand the variability of the estimates. Overall, the meta- and mega-genetic analyses produced robust estimates of heritability.
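
The two meta-analytical weightings named above (sample-size weighted and standard-error weighted) and the leave-one-out check can be illustrated with a small sketch over per-cohort heritability estimates. The cohort values below are placeholders, not ENIGMA-DTI results, and the per-cohort heritability estimation itself is done with variance-components models in SOLAR-Eclipse rather than shown here.

```python
# Minimal sketch of sample-size weighted and standard-error (inverse-variance)
# weighted meta-analysis of per-cohort heritability (h2) estimates, plus a
# leave-one-out pass. All numbers are illustrative placeholders.
import math

# (cohort, h2 estimate, standard error of h2, sample size) -- placeholders.
cohorts = [
    ("A", 0.62, 0.08, 450),
    ("B", 0.55, 0.10, 300),
    ("C", 0.70, 0.06, 600),
    ("D", 0.48, 0.12, 250),
    ("E", 0.66, 0.09, 650),
]

def sample_size_weighted(rows):
    """Pooled h2 weighted by cohort sample size."""
    return sum(n * h2 for _, h2, _, n in rows) / sum(n for _, _, _, n in rows)

def inverse_variance_weighted(rows):
    """Pooled h2 weighted by 1/SE^2, with the standard error of the pooled estimate."""
    weights = [1 / se**2 for _, _, se, _ in rows]
    pooled = sum(w * h2 for w, (_, h2, _, _) in zip(weights, rows)) / sum(weights)
    pooled_se = math.sqrt(1 / sum(weights))
    return pooled, pooled_se

print("N-weighted h2:", round(sample_size_weighted(cohorts), 3))
pooled, se = inverse_variance_weighted(cohorts)
print(f"SE-weighted h2: {pooled:.3f} (SE {se:.3f})")

# Leave-one-out: drop each cohort in turn to see how much the pooled estimate moves.
for left_out in cohorts:
    rest = [c for c in cohorts if c is not left_out]
    pooled_loo, _ = inverse_variance_weighted(rest)
    print(f"without {left_out[0]}: SE-weighted h2 = {pooled_loo:.3f}")
```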

Relevance:

20.00%

Publisher:

Abstract:

This paper examines the dispute between the Seattle company Virtual Countries Inc. and the Republic of South Africa over the ownership of the domain name southafrica.com. The first part of the paper deals with the pre-emptive litigation brought by Virtual Countries Inc. in a United States District Court. The second part considers the possible arbitration of the dispute under the Uniform Domain Name Dispute Resolution Policy of the Internet Corporation for Assigned Names and Numbers (ICANN) and examines the wider implications of the dispute for the jurisdiction and governance of ICANN. The final section of the paper evaluates the Final Report of the Second WIPO Internet Domain Name Process.

Relevance:

20.00%

Publisher:

Abstract:

In his 1987 book, The Media Lab: Inventing the Future at MIT, Stewart Brand provides an insight into the visions of the future of the media in the 1970s and 1980s. He notes that Nicholas Negroponte made a compelling case for the foundation of a media laboratory at MIT with diagrams detailing the convergence of three sectors of the media: the broadcast and motion picture industry; the print and publishing industry; and the computer industry. Stewart Brand commented: 'If Negroponte was right and communications technologies really are converging, you would look for signs that technological homogenisation was dissolving old boundaries out of existence, and you would expect an explosion of new media where those boundaries used to be'.

Two decades later, technology developers, media analysts and lawyers have become excited about the latest phase of media convergence. In 2006, the faddish Time Magazine heralded the arrival of various Web 2.0 social networking services: 'You can learn more about how Americans live just by looking at the backgrounds of YouTube videos—those rumpled bedrooms and toy-strewn basement rec rooms—than you could from 1,000 hours of network television. And we didn't just watch, we also worked. Like crazy. We made Facebook profiles and Second Life avatars and reviewed books at Amazon and recorded podcasts. We blogged about our candidates losing and wrote songs about getting dumped. We camcordered bombing runs and built open-source software. America loves its solitary geniuses—its Einsteins, its Edisons, its Jobses—but those lonely dreamers may have to learn to play with others. Car companies are running open design contests. Reuters is carrying blog postings alongside its regular news feed. Microsoft is working overtime to fend off user-created Linux. We're looking at an explosion of productivity and innovation, and it's just getting started, as millions of minds that would otherwise have drowned in obscurity get backhauled into the global intellectual economy.' The magazine announced that Time's Person of the Year was 'You', the everyman and everywoman consumer, 'for seizing the reins of the global media, for founding and framing the new digital democracy, for working for nothing and beating the pros at their own game'.

This review essay considers three recent books which have explored the legal dimensions of new media. In contrast to the unbridled exuberance of Time Magazine, this series of legal works displays an anxious trepidation about the legal ramifications associated with the rise of social networking services. In his tour de force, The Future of Reputation: Gossip, Rumor, and Privacy on the Internet, Daniel Solove considers the implications of social networking services, such as Facebook and YouTube, for the legal protection of reputation under privacy law and defamation law. Andrew Kenyon's edited collection, TV Futures: Digital Television Policy in Australia, explores the intersection between media law and copyright law in the regulation of digital television and Internet videos. In The Future of the Internet and How to Stop It, Jonathan Zittrain explores the impact of 'generative' technologies and 'tethered applications', considering everything from the Apple Mac and the iPhone to the One Laptop per Child programme.

Relevance:

20.00%

Publisher:

Abstract:

What role does Australia play in debates over the regulation and governance of the Internet? Is it a hub? A node in the information grid? Or is it a mere cul-de-sac? Or are we mere road-kill, bush junk, on the information autobahn?

Relevance:

20.00%

Publisher:

Abstract:

The Trans-Pacific Partnership (TPP) is a highly secretive trade agreement being negotiated between the US and eleven other Pacific Rim countries, including Australia. Having obtained fast-track authority from the United States Congress, US President Barack Obama is keen to finalise the deal; however, he was unable to achieve a resolution at the recent TPP talks in Hawaii. A number of chapters of the TPP will affect creative artists, cultural industries and internet freedom, including the intellectual property chapter, the investment chapter and the electronic commerce chapter. Legacy copyright industries have pushed for longer and stronger copyright protection throughout the Pacific Rim. In the wake of the Hawaii talks, Knowledge Ecology International leaked the latest version of the intellectual property chapter of the TPP. Jamie Love of Knowledge Ecology International commented on the leaked text about copyright law: 'In many sections of the text, the TPP would change global norms, restrict access to knowledge, create significant financial risks for persons using and sharing information, and, in some cases, impose new costs on persons producing new knowledge goods.' The leaked text reveals a philosophical debate about the nature of intellectual property law, with mixed messages in respect of the treatment of the public domain under copyright law. In one part of the agreement, on internet service providers, there is text that says the parties recognise the need for 'promoting innovation and creativity', 'facilitating the diffusion of information, knowledge, technology, culture, and the arts' and 'foster competition and open and efficient markets'. A number of countries suggested 'acknowledging the importance of the public domain'. The United States and Japan opposed the recognition of the public domain in this text.