509 results for anonimato rete privacy deep web onion routing cookie
Abstract:
Information security and privacy in the healthcare domain pose a complex and challenging problem for computer scientists, social scientists, legal experts and policy makers. Appropriate healthcare provision requires specialized knowledge, is information intensive, and much patient information is of a particularly sensitive nature. Electronic health record systems provide opportunities for information sharing which may enhance healthcare services, for both individuals and populations. However, appropriate information management measures are essential for privacy preservation...
Abstract:
Dwellings in multi-storey apartment buildings (MSAB) are predicted to increase dramatically as a proportion of housing stock in subtropical cities over coming decades. The problem of designing comfortable and healthy high-density residential environments and minimising energy consumption must be addressed urgently in subtropical cities globally. This paper explores private residents’ experiences of privacy and comfort, and their perceptions of how well their apartment dwelling modulated the external environment in subtropical conditions, through analysis of 636 survey responses and 24 interviews with residents of MSAB in inner urban neighbourhoods of Brisbane, Australia. The findings show that the availability of natural ventilation and outdoor private living spaces plays an important role in resident perceptions of liveability in the subtropics, where the climate is conducive to year-round “outdoor living”. Residents valued choice with regard to climate control methods in their apartments. They overwhelmingly preferred natural ventilation to manage thermal comfort, and turned to the air-conditioner for limited periods, particularly when external conditions were too noisy. These findings provide a unique evidence base for reducing the environmental impact of MSAB and increasing the acceptability of apartment living, through incorporating residential attributes positioned around climate-responsive architecture.
Abstract:
Motivation: Extracellular vesicles (EVs) are spherical bilayered proteolipids harboring various bioactive molecules. Due to the complexity of vesicular nomenclatures and components, online searches for EV-related publications and vesicular components are currently challenging. Results: We present an improved version of EVpedia, a public database for EV research. This community web portal contains a database of publications and vesicular components, identification of orthologous vesicular components, bioinformatic tools, and a personalized function. EVpedia includes 6,879 publications and 172,080 vesicular components from 263 high-throughput datasets, and has been accessed more than 65,000 times from more than 750 cities. In addition, about 350 members from 73 international research groups have participated in developing EVpedia. This free web-based database might serve as a useful resource to stimulate the emerging field of EV research. Availability and implementation: The web site was implemented in PHP, Java, MySQL and Apache, and is freely available at http://evpedia.info.
Abstract:
Currently we are facing an overwhelming growth in the number of reliable information sources on the Internet. The quantity of information available to everyone via the Internet is growing dramatically each year [15]. At the same time, the temporal and cognitive resources of human users are not changing, causing a phenomenon of information overload. The World Wide Web is one of the main sources of information for decision makers (reference to my research). However, our studies show that, at least in Poland, decision makers see some important problems when turning to the Internet as a source of decision information. One of the most commonly raised obstacles is the distribution of relevant information among many sources, and therefore the need to visit different Web sources in order to collect all important content and analyze it. A few research groups have recently turned to the problem of information extraction from the Web [13]. Most effort so far has been directed toward collecting data from dispersed databases accessible via web pages (referred to as data extraction, or information extraction from the Web) and toward understanding natural language texts by means of fact, entity, and association recognition (referred to as information extraction). Data extraction efforts show some interesting results, but proper integration of web databases is still beyond us. The information extraction field has recently been very successful in retrieving information from natural language texts, but it still lacks the ability to understand more complex information requiring the use of common-sense knowledge, discourse analysis and disambiguation techniques.
Abstract:
Web service and business process technologies are widely adopted to facilitate business automation and collaboration. Given the complexity of business processes, it is a sought-after feature to show a business process through different views, catering for the diverse interests, authority levels, etc., of different users. Aiming to implement such flexible process views in the Web service environment, this paper presents a novel framework named FlexView to support view abstraction and concretisation of WS-BPEL processes. In the FlexView framework, a rigorous view model is proposed to specify the dependency and correlation between structural components of process views, with emphasis on the characteristics of WS-BPEL, and a set of rules is defined to guarantee the structural consistency between process views during transformations. A set of algorithms is developed to shift the abstraction and concretisation operations to the operational level. A prototype is also implemented as a proof of concept. © 2010 Springer Science+Business Media, LLC.
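The core view-abstraction idea the abstract describes can be illustrated with a small sketch: hiding one structured fragment of a process tree behind an opaque "abstract" activity while leaving the surrounding structure intact. The `Node` class, the activity names, and the `abstract` function below are hypothetical illustrations, not the FlexView API.

```python
# Hypothetical sketch of process-view abstraction (not the FlexView API).
class Node:
    """A node in a simplified process tree (e.g. a WS-BPEL-like structure)."""
    def __init__(self, kind, name, children=None):
        self.kind, self.name = kind, name
        self.children = children or []

def abstract(node, target):
    """Return a view of the tree in which the node named `target` is
    replaced by an opaque 'abstract' activity with its subtree hidden."""
    if node.name == target:
        return Node("abstract", target)
    copy = Node(node.kind, node.name)
    copy.children = [abstract(c, target) for c in node.children]
    return copy

# An invented order-handling process: a sequence containing a parallel flow.
process = Node("sequence", "handle_order", [
    Node("activity", "receive"),
    Node("flow", "fulfil", [
        Node("activity", "pick"),
        Node("activity", "pack"),
    ]),
    Node("activity", "ship"),
])

view = abstract(process, "fulfil")
print([c.name for c in view.children])  # the sequence structure survives
print(view.children[1].children)        # the fulfil internals are hidden
```

Concretisation would be the inverse mapping back to the full subtree; the consistency rules the paper proposes would constrain which fragments may be collapsed so that control-flow dependencies remain well-formed.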
Abstract:
We present an empirical evaluation and comparison of two content extraction methods in HTML: absolute XPath expressions and relative XPath expressions. We argue that the relative XPath expressions, although not widely used, should be used in preference to absolute XPath expressions in extracting content from human-created Web documents. Evaluation of robustness covers four thousand queries executed on several hundred webpages. We show that in referencing parts of real world dynamic HTML documents, relative XPath expressions are on average significantly more robust than absolute XPath ones.
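The absolute-versus-relative distinction can be sketched with Python's standard-library `xml.etree.ElementTree`, which supports a limited XPath subset. The sample document, element names, and attribute values below are invented for illustration, not drawn from the paper's corpus.

```python
# Sketch: why relative (attribute-anchored) expressions tend to be more
# robust than absolute positional paths when a page's layout changes.
import xml.etree.ElementTree as ET

doc = ET.fromstring(
    "<html><body>"
    "<div id='header'>Site name</div>"
    "<div id='content'><p class='price'>42.00</p></div>"
    "</body></html>"
)

# Absolute-style expression: a fixed path with positional indexing.
absolute = doc.find("./body/div[2]/p")

# Relative-style expression: anchored on a stable attribute instead.
relative = doc.find(".//div[@id='content']/p[@class='price']")
assert absolute.text == relative.text == "42.00"

# Simulate a layout change: a banner div inserted before the content div.
body = doc.find("./body")
body.insert(0, ET.Element("div", {"id": "banner"}))

print(doc.find("./body/div[2]/p"))          # absolute path now misses
print(doc.find(".//div[@id='content']/p"))  # relative path still resolves
```

After the insertion, the second `div` is no longer the content block, so the positional path stops matching, while the attribute-anchored expression is unaffected; this is the robustness property the abstract measures at scale.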
Abstract:
The 3D Water Chemistry Atlas is an intuitive, open source, Web-based system that enables the three-dimensional (3D) sub-surface visualization of ground water monitoring data, overlaid on the local geological model (formation and aquifer strata). This paper firstly describes the results of evaluating existing virtual globe technologies, which led to the decision to use the Cesium open source WebGL Virtual Globe and Map Engine as the underlying platform. Next it describes the backend database and search, filtering, browse and analysis tools that were developed to enable users to interactively explore the groundwater monitoring data and interpret it spatially and temporally relative to the local geological formations and aquifers via the Cesium interface. The result is an integrated 3D visualization system that enables environmental managers and regulators to assess groundwater conditions, identify inconsistencies in the data, manage impacts and risks and make more informed decisions about coal seam gas extraction, waste water extraction, and water reuse.
Abstract:
Recent research on hollow flange beams has led to the development of an innovative rectangular hollow flange channel beam (RHFCB) for use in floor systems. The new RHFCB is a mono-symmetric structural section made by intermittently rivet fastening two torsionally rigid closed rectangular hollow flanges to a web plate element, which allows section optimisation by selecting appropriate combinations of web and flange widths and thicknesses. However, the current design rules for cold-formed steel sections are not directly applicable to rivet fastened RHFCBs. To date, no investigation has been conducted on their web crippling behaviour and strengths. Hence an experimental study was conducted to investigate the web crippling behaviour and capacities of rivet fastened RHFCBs under End Two Flange (ETF) and Interior Two Flange (ITF) load cases. It showed that RHFCBs failed by web crippling, flange crushing and combinations of the two. Comparison of ultimate web crippling capacities with the predictions from the design equations in AS/NZS 4600 and AISI S100 showed that the current design equations are unconservative for rivet fastened RHFCB sections under ETF and ITF load cases. New equations were therefore proposed to determine the web crippling capacities of rivet fastened RHFCBs. These equations can also be used to conservatively predict the capacities of RHFCBs subject to combined web crippling and flange crushing. However, separate capacity equations were proposed for the flange crushing failures that occurred in thinner flanges with smaller bearing lengths. This paper presents the details of this web crippling experimental study of RHFCB sections and its results.
Abstract:
Purpose Following the perspective of frustration theory, customer frustration incidents lead to frustration behavior such as protest (negative word‐of‐mouth). On the internet customers can express their emotions verbally and non‐verbally in numerous web‐based review platforms. The purpose of this study is to investigate online dysfunctional customer behavior, in particular negative “word‐of‐web” (WOW) in online feedback forums, among customers who participate in frequent‐flier programs in the airline industry. Design/methodology/approach The study employs a variation of the critical incident technique (CIT) referred to as the critical internet feedback technique (CIFT). Qualitative data of customer reviews of 13 different frequent‐flier programs posted on the internet were collected and analyzed with regard to frustration incidents, verbal and non‐verbal emotional effects and types of dysfunctional word‐of‐web customer behavior. The sample includes 141 negative customer reviews based on non‐recommendations and low program ratings. Findings Problems with loyalty programs evoke negative emotions that are expressed in a spectrum of verbal and non‐verbal negative electronic word‐of‐mouth. Online dysfunctional behavior can vary widely, from low ratings and non‐recommendations, to voicing switching intentions, to even stronger forms such as manipulation of others and revenge intentions. Research limitations/implications Results have to be viewed carefully due to methodological challenges with regard to the measurement of emotions, in particular the accuracy of self‐report techniques and the quality of online data. Generalization of the results is limited because the study utilizes data from only one industry. Further research is needed with regard to the exact differentiation of frustration from related constructs.
In addition, large‐scale quantitative studies are necessary to specify and test the relationships between frustration incidents and subsequent dysfunctional customer behavior expressed in negative word‐of‐web. Practical implications The study yields important implications for the monitoring of the perceived quality of loyalty programs. Management can obtain valuable information about program‐related and/or relationship‐related frustration incidents that lead to online dysfunctional customer behavior. A proactive response strategy should be developed to deal with severe cases, such as sabotage plans. Originality/value This study contributes to knowledge regarding the limited research of online dysfunctional customer behavior as well as frustration incidents of loyalty programs. Also, the article presents a theoretical “customer frustration‐defection” framework that describes different levels of online dysfunctional behavior in relation to the level of frustration sensation that customers have experienced. The framework extends the existing perspective of the “customer satisfaction‐loyalty” framework developed by Heskett et al.
Abstract:
Lattice-based cryptographic primitives are believed to offer resilience against attacks by quantum computers. We demonstrate the practicality of post-quantum key exchange by constructing cipher suites for the Transport Layer Security (TLS) protocol that provide key exchange based on the ring learning with errors (R-LWE) problem, and we accompany these cipher suites with a rigorous proof of security. Our approach ties lattice-based key exchange together with traditional authentication using RSA or elliptic curve digital signatures: the post-quantum key exchange provides forward secrecy against future quantum attackers, while authentication can be provided using RSA keys that are issued by today's commercial certificate authorities, smoothing the path to adoption. Our cryptographically secure implementation, aimed at the 128-bit security level, reveals that the performance price when switching from non-quantum-safe key exchange is not too high. With our R-LWE cipher suites integrated into the OpenSSL library and using the Apache web server on a 2-core desktop computer, we could serve 506 RLWE-ECDSA-AES128-GCM-SHA256 HTTPS connections per second for a 10 KiB payload. Compared to elliptic curve Diffie-Hellman, this means an 8 KiB increased handshake size and a reduction in throughput of only 21%. This demonstrates that provably secure post-quantum key exchange can already be considered practical.
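The algebra behind such an exchange can be sketched in toy form. The parameters below are far too small to be secure, and the small error terms and reconciliation step of real R-LWE are omitted for clarity, so both sides agree exactly rather than approximately; everything here is an illustrative simplification, not the paper's implementation.

```python
# Toy sketch of the Diffie-Hellman-like structure of ring-LWE key exchange.
# WARNING: no errors, no reconciliation, tiny parameters -- illustration only.
import random

N, Q = 8, 97  # ring Z_Q[x]/(x^N + 1); real schemes use e.g. N = 1024

def ring_mul(f, g):
    """Multiply two degree-<N polynomials mod (x^N + 1, Q)."""
    out = [0] * N
    for i, fi in enumerate(f):
        for j, gj in enumerate(g):
            k = i + j
            if k < N:
                out[k] = (out[k] + fi * gj) % Q
            else:  # reduction x^N = -1 wraps coefficients with a sign flip
                out[k - N] = (out[k - N] - fi * gj) % Q
    return out

rng = random.Random(0)
a = [rng.randrange(Q) for _ in range(N)]        # shared public polynomial

s_alice = [rng.randrange(Q) for _ in range(N)]  # secret polynomials
s_bob = [rng.randrange(Q) for _ in range(N)]

b_alice = ring_mul(a, s_alice)  # public values; real R-LWE adds a small error
b_bob = ring_mul(a, s_bob)

k_alice = ring_mul(b_bob, s_alice)  # = a * s_bob * s_alice
k_bob = ring_mul(b_alice, s_bob)    # = a * s_alice * s_bob

assert k_alice == k_bob  # both sides derive the same ring element
```

With the error terms restored, `k_alice` and `k_bob` would only be approximately equal, which is why actual R-LWE key exchange needs a reconciliation mechanism to extract identical shared bits; the hardness of recovering a secret from its noisy public value is what resists quantum attack.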
Abstract:
The book, New Dimensions in Privacy Law, has an arresting cover — a pack of paparazzi take photographs, with their flash-bulbs popping and exploding, like starbursts in the sky. The collection explores the valiant efforts of courts and parliaments to defend the privacy of individuals against such unwanted intrusions.
Abstract:
In his 1987 book, The Media Lab: Inventing the Future at MIT, Stewart Brand provides an insight into the visions of the future of the media in the 1970s and 1980s. He notes that Nicholas Negroponte made a compelling case for the foundation of a media laboratory at MIT with diagrams detailing the convergence of three sectors of the media—the broadcast and motion picture industry; the print and publishing industry; and the computer industry. Stewart Brand commented: ‘If Negroponte was right and communications technologies really are converging, you would look for signs that technological homogenisation was dissolving old boundaries out of existence, and you would expect an explosion of new media where those boundaries used to be’. Two decades later, technology developers, media analysts and lawyers have become excited about the latest phase of media convergence. In 2006, the faddish Time Magazine heralded the arrival of various Web 2.0 social networking services: You can learn more about how Americans live just by looking at the backgrounds of YouTube videos—those rumpled bedrooms and toy‐strewn basement rec rooms—than you could from 1,000 hours of network television. And we didn’t just watch, we also worked. Like crazy. We made Facebook profiles and Second Life avatars and reviewed books at Amazon and recorded podcasts. We blogged about our candidates losing and wrote songs about getting dumped. We camcordered bombing runs and built open‐source software. America loves its solitary geniuses—its Einsteins, its Edisons, its Jobses—but those lonely dreamers may have to learn to play with others. Car companies are running open design contests. Reuters is carrying blog postings alongside its regular news feed. Microsoft is working overtime to fend off user‐created Linux.
We’re looking at an explosion of productivity and innovation, and it’s just getting started, as millions of minds that would otherwise have drowned in obscurity get backhauled into the global intellectual economy. The magazine announced that Time’s Person of the Year was ‘You’, the everyman and everywoman consumer ‘for seizing the reins of the global media, for founding and framing the new digital democracy, for working for nothing and beating the pros at their own game’. This review essay considers three recent books, which have explored the legal dimensions of new media. In contrast to the unbridled exuberance of Time Magazine, this series of legal works displays an anxious trepidation about the legal ramifications associated with the rise of social networking services. In his tour de force, The Future of Reputation: Gossip, Rumor, and Privacy on the Internet, Daniel Solove considers the implications of social networking services, such as Facebook and YouTube, for the legal protection of reputation under privacy law and defamation law. Andrew Kenyon’s edited collection, TV Futures: Digital Television Policy in Australia, explores the intersection between media law and copyright law in the regulation of digital television and Internet videos. In The Future of the Internet and How to Stop It, Jonathan Zittrain explores the impact of ‘generative’ technologies and ‘tethered applications’—considering everything from the Apple Mac and the iPhone to the One Laptop per Child programme.
Abstract:
“If Hollywood could order intellectual property laws for Christmas, what would they look like? This is pretty close.” David Fewer “While European and American IP maximalists have pushed for TRIPS-Plus provisions in FTAs and bilateral agreements, they are now pushing for TRIPS-Plus-Plus protections in these various forums.” Susan Sell “ACTA is a threat to the future of a free and open Internet.” Alexander Furnas “Implementing the agreement could open a Pandora's box of potential human rights violations.” Amnesty International. “I will not take part in this masquerade.” Kader Arif, Rapporteur for the Anti-Counterfeiting Trade Agreement 2011 in the European Parliament Executive Summary As an independent scholar and expert in intellectual property, I am of the view that the Australian Parliament should reject the adoption of the Anti-Counterfeiting Trade Agreement 2011. I would take issue with the Department of Foreign Affairs and Trade’s rather partisan account of the negotiations, the consultations, and the outcomes associated with the Anti-Counterfeiting Trade Agreement 2011. In my view, the negotiations were secretive and biased; the local consultations were sometimes farcical because of the lack of information about the draft texts of the agreement; and the final text of the Anti-Counterfeiting Trade Agreement 2011 is not in the best interests of Australia, particularly given that it is a net importer of copyright works and trade mark goods and services. I would also express grave reservations about the quality of the rather pitiful National Interest Analysis – and the lack of any regulatory impact statement – associated with the Anti-Counterfeiting Trade Agreement 2011. 
The assertion that the Anti-Counterfeiting Trade Agreement 2011 does not require legislative measures is questionable – especially given the United States Trade Representative has called the agreement ‘the highest-standard plurilateral agreement ever achieved concerning the enforcement of intellectual property rights.’ It is worthwhile reiterating that there has been much criticism of the secretive and partisan nature of the negotiations surrounding the Anti-Counterfeiting Trade Agreement 2011. Sean Flynn summarizes these concerns: "The negotiation process for ACTA has been a case study in establishing the conditions for effective industry capture of a lawmaking process. Instead of using the relatively transparent and inclusive multilateral processes, ACTA was launched through a closed and secretive “‘club approach’ in which like-minded jurisdictions define enforcement ‘membership’ rules and then invite other countries to join, presumably via other trade agreements.” The most influential developing countries, including Brazil, India, China and Russia, were excluded. Likewise, a series of manoeuvres ensured that public knowledge about the specifics of the agreement and opportunities for input into the process were severely limited. Negotiations were held with mere hours' notice to the public as to when and where they would be convened, often in countries halfway around the world from where public interest groups are housed. Once there, all negotiation processes were closed to the public. Draft texts were not released before or after most negotiating rounds, and meetings with stakeholders took place only behind closed doors and off the record. A public release of draft text, in April 2010, was followed by no public or on-the-record meetings with negotiators."
Moreover, it is disturbing that the Anti-Counterfeiting Trade Agreement 2011 has been driven by ideology and faith, rather than by any evidence-based policy making. Professor Duncan Matthews has raised significant questions about the quality of empirical evidence used to support the proposal of the Anti-Counterfeiting Trade Agreement 2011: ‘There are concerns that statements about levels of counterfeiting and piracy are based either on customs seizures, with the actual quantities of infringing goods in free circulation in any particular market largely unknown, or on estimated losses derived from industry surveys.’ It is particularly disturbing that, in spite of past criticism, the Department of Foreign Affairs and Trade has supported the Anti-Counterfeiting Trade Agreement 2011 without engaging the Productivity Commission or the Treasury to do a proper economic analysis of the proposed treaty. Kader Arif, Rapporteur for the Anti-Counterfeiting Trade Agreement 2011 in the European Parliament, quit his position, and said of the process: "I want to denounce in the strongest possible manner the entire process that led to the signature of this agreement: no inclusion of civil society organisations, a lack of transparency from the start of the negotiations, repeated postponing of the signature of the text without an explanation being ever given, exclusion of the EU Parliament's demands that were expressed on several occasions in our assembly.
As rapporteur of this text, I have faced never-before-seen manoeuvres from the right wing of this Parliament to impose a rushed calendar before public opinion could be alerted, thus depriving the Parliament of its right to expression and of the tools at its disposal to convey citizens' legitimate demands. Everyone knows the ACTA agreement is problematic, whether it is its impact on civil liberties, the way it makes Internet access providers liable, its consequences on generic drugs manufacturing, or how little protection it gives to our geographical indications. This agreement might have major consequences on citizens' lives, and still, everything is being done to prevent the European Parliament from having its say in this matter. That is why today, as I release this report for which I was in charge, I want to send a strong signal and alert the public opinion about this unacceptable situation. I will not take part in this masquerade." There have been parallel concerns about the process and substance of the Anti-Counterfeiting Trade Agreement 2011 in the context of Australia. I have a number of concerns about the substance of the Anti-Counterfeiting Trade Agreement 2011. First, I am concerned that the Anti-Counterfeiting Trade Agreement 2011 fails to provide appropriate safeguards in respect of human rights, consumer protection, competition, and privacy laws. It is recommended that the new Joint Parliamentary Committee on Human Rights investigate this treaty. Second, I argue that there is a lack of balance to the copyright measures in the Anti-Counterfeiting Trade Agreement 2011 – the definition of piracy is overbroad; the suite of civil remedies, criminal offences, and border measures is excessive; and there is a lack of suitable protection for copyright exceptions, limitations, and remedies. Third, I discuss trade mark law, intermediary liability, and counterfeiting.
I express my concerns, in this context, that the Anti-Counterfeiting Trade Agreement 2011 could have an adverse impact upon consumer interests, competition policy, and innovation in the digital economy. I also note, with concern, the lobbying by tobacco industries for the Anti-Counterfeiting Trade Agreement 2011 – and the lack of any recognition in the treaty for the capacity of countries to take measures of tobacco control under the World Health Organization Framework Convention on Tobacco Control. Fourth, I note that the Anti-Counterfeiting Trade Agreement 2011 provides no positive obligations to promote access to essential medicines. It is particularly lamentable that Australia and the United States of America have failed to implement the Doha Declaration on the TRIPS Agreement and Public Health 2001 and the WTO General Council Decision 2003. Fifth, I express concerns about the border measures in the Anti-Counterfeiting Trade Agreement 2011. Such measures lack balance – and unduly favour the interests of intellectual property owners over consumers, importers, and exporters. Moreover, such measures will be costly, as they involve shifting the burden of intellectual property enforcement to customs and border authorities. Interdicting, seizing, and destroying goods may also raise significant trade issues. Finally, I express concern that the Anti-Counterfeiting Trade Agreement 2011 undermines the role of existing international organisations, such as the United Nations, the World Intellectual Property Organization and the World Trade Organization, and subverts international initiatives such as the WIPO Development Agenda 2007. I also question the raison d'être, independence, transparency, and accountability of the proposed new ‘ACTA Committee’. In this context, I am concerned by the shift in the position of the Labor Party in its approach to international treaty-making in relation to intellectual property. 
The Australian Parliament adopted the Australia-United States Free Trade Agreement 2004, which included a large Chapter on intellectual property. The treaty was a ‘TRIPS-Plus’ agreement, because the obligations were much more extensive and prescriptive than those required under the multilateral framework established by the TRIPS Agreement 1994. During the debate over the Australia-United States Free Trade Agreement 2004, the Labor Party expressed the view that it would seek to mitigate the effects of the TRIPS-Plus Agreement when it gained power. Far from seeking to ameliorate the effects of the Australia-United States Free Trade Agreement 2004, the Labor Government would seek to lock Australia into a TRIPS-Double Plus Agreement – the Anti-Counterfeiting Trade Agreement 2011. There has not been a clear political explanation for this change in approach to international intellectual property. For both reasons of process and substance, I conclude that the Australian Parliament and the Australian Government should reject the Anti-Counterfeiting Trade Agreement 2011. The Australian Government would do better to endorse the Washington Declaration on Intellectual Property and the Public Interest 2011, and implement its outstanding obligations in respect of access to knowledge, access to essential medicines, and the WIPO Development Agenda 2007. The case study of the Anti-Counterfeiting Trade Agreement 2011 highlights the need for further reforms to the process by which Australia engages in international treaty-making.
Impacts of sodic soil amelioration on hydraulic conductivity and deep drainage in the Lower Burdekin
Abstract:
An understanding of the influence of soil chemistry on soil hydraulic properties is of critical importance for the management of sodic soils under irrigation. The hydraulic conductivity of sodic soils has been shown to be affected by properties of the applied solution, including pH (Suarez et al. 1984), sodicity and salt concentration (McNeal and Coleman 1966). The changes in soil hydraulic conductivity are the result of changes in the spacing between clay layers in response to changes in soil solution chemistry. While the importance of soil chemistry in controlling hydraulic conductivity is known, the exact impacts of sodic soil amelioration on hydraulic conductivity and deep drainage at a given location are difficult to predict. This is because the relationships between soil chemical factors and hydraulic conductivity are soil-specific, and because local site-specific factors also need to be considered to determine the actual impacts on deep drainage rates.