925 results for Data breach notification law
Abstract:
Integrated reporting (
Abstract:
This article examines a series of controversies within the life sciences over data sharing. Part 1 focuses upon the agricultural biotechnology firm Syngenta publishing data on the rice genome in the journal Science, and considers proposals to reform scientific publishing and funding to encourage data sharing. Part 2 examines the relationship between intellectual property rights and scientific publishing, in particular copyright protection of databases, and evaluates the declaration of the Human Genome Organisation that genomic databases should be global public goods. Part 3 looks at varying opinions on the information function of patent law, and then considers the proposals of Patrinos and Drell to provide incentives for private corporations to release data into the public domain.
Abstract:
This article considers whether the granting of patents in respect of biomedical genetic research should be conditional upon the informed consent of research participants. It focuses upon several case studies. In Moore v the Regents of the University of California, a patient sued his physician for breach of fiduciary duty and lack of informed consent, because the doctor had obtained a patent on the patient's cell line without the patient's authorisation. In Greenberg v Miami Children's Hospital, the research participants, the Greenbergs, the National Tay-Sachs and Allied Diseases Association, and Dor Yeshorim brought a legal action against the geneticist Reuben Matalon and the Miami Children's Hospital over a patent obtained on a gene related to Canavan disease and an accompanying genetic diagnostic test. PXE International entered into a joint venture with Charles Boyd and the University of Hawaii, and obtained a patent together for ‘methods for diagnosing Pseudoxanthoma elasticum’. In light of such case studies, it is contended that there is a need to reform patent law, so as to recognise the bioethical principles of informed consent and benefit-sharing. The 2005 UNESCO Declaration on Bioethics and Human Rights provides a model for future case law and policy-making.
Abstract:
The Trans-Pacific Partnership is a sweeping trade agreement, spanning the Pacific Rim, and covering an array of topics, including intellectual property. There has been much analysis of the recently leaked intellectual property chapter of the Trans-Pacific Partnership by WikiLeaks. Julian Assange, WikiLeaks’ Editor-in-Chief, observed “The selective secrecy surrounding the TPP negotiations, which has let in a few cashed-up megacorps but excluded everyone else, reveals a telling fear of public scrutiny. By publishing this text we allow the public to engage in issues that will have such a fundamental impact on their lives.” Critical attention has focused upon the lack of transparency surrounding the agreement, copyright law and the digital economy; patent law, pharmaceutical drugs, and data protection; and the criminal procedures and penalties for trade secrets. The topic of trade mark law and related rights, such as internet domain names and geographical indications, deserves greater analysis.
Abstract:
Empirical evidence in Australia and overseas has established that in many university disciplines, students begin to experience elevated levels of psychological distress in their first year of study. There is now a considerable body of empirical data that establishes that this is a significant problem for law students. Psychological distress may hamper a law student’s capacity to learn successfully, and certainly hinders their ability to thrive in the tertiary environment. We know from Self-Determination Theory (SDT), a conceptual branch of positive psychology, that supporting students’ autonomy in turn supports their well-being. This article seeks to connect the literature on law student well-being and independent learning using Self-Determination Theory (SDT) as the theoretical bridge. We argue that deliberate instruction in the development of independent learning skills in the first year curriculum is autonomy supportive. It can therefore lay the foundation for academic and personal success at university, and may be a protective factor against decline in law student psychological well-being.
Abstract:
Remedying the mischief of phoenix activity is of practical importance. The benefits include continued confidence in our economy, law that inspires best practice among directors, and law that is articulated in a manner such that penalties act as a sufficient deterrent and the regulatory system is able to detect offenders and bring them to account. Any further reforms must accommodate and tolerate legal phoenix activity. Phoenix activity pushes tolerance of entrepreneurial activity to its absolute limits. The wisest approach would be to front end the reforms so as to alleviate the considerable detection and enforcement burden upon regulatory bodies. There is little doubt that breach of the existing law is difficult and expensive to detect; and this is a significant burden when regulators have shrinking budgets and are rapidly losing feet on the ground. This front end approach may need to include restrictions on access to limited liability. The more limited liability is misused, the stronger the argument to limit access to limited liability. This paper proposes that such an approach is a legitimate next step for a robust and mature capitalist economy.
Abstract:
In a medical negligence context, and under the causation provisions enacted pursuant to Civil Liability Legislation in most Australian jurisdictions, the normative concept of “scope of liability” requires a consideration of whether or not and why a medical practitioner should be responsible for a patient’s harm. As such, it places a limit on the extent to which practitioners are deemed liable for a breach of the duty of care owed by them, in circumstances where a legal factual connection between that breach and the causation of a patient’s harm has already been shown. It has been said that a determination of causation requires ‘the identification and articulation of an evaluative judgement by reference to “the purposes and policy of the relevant part of the law”’: Wallace v Kam (2013) 297 ALR 383, 388. Accordingly, one of the normative factors falling within scope of liability is an examination of the content and purpose of the rule or duty of care violated – that is, its underlying policy and whether this supports an attribution of legal responsibility upon a practitioner. In this context, and with reference to recent jurisprudence, this paper considers: the policy relevant to a practitioner’s duty of care in each of the areas of diagnosis, treatment and advice; how this has been used to determine an appropriate scope of liability for the purpose of the causation inquiry in medical negligence claims; and whether such an approach is problematic for medical standards or decision-making.
Abstract:
Background: A genetic network can be represented as a directed graph in which a node corresponds to a gene and a directed edge specifies the direction of influence of one gene on another. The reconstruction of such networks from transcript profiling data remains an important yet challenging endeavor. A transcript profile specifies the abundances of many genes in a biological sample of interest. Prevailing strategies for learning the structure of a genetic network from high-dimensional transcript profiling data assume sparsity and linearity. Many methods consider relatively small directed graphs, inferring graphs with up to a few hundred nodes. This work examines large undirected graph representations of genetic networks, graphs with many thousands of nodes where an undirected edge between two nodes does not indicate the direction of influence, and the problem of estimating the structure of such a sparse linear genetic network (SLGN) from transcript profiling data.
Results: The structure learning task is cast as a sparse linear regression problem, which is then posed as a LASSO (l1-constrained fitting) problem and finally solved by formulating a Linear Program (LP). A bound on the Generalization Error of this approach is given in terms of the Leave-One-Out Error. The accuracy and utility of LP-SLGNs is assessed quantitatively and qualitatively using simulated and real data. The Dialogue for Reverse Engineering Assessments and Methods (DREAM) initiative provides gold standard data sets and evaluation metrics that enable and facilitate the comparison of algorithms for deducing the structure of networks. The structures of LP-SLGNs estimated from the INSILICO1, INSILICO2 and INSILICO3 simulated DREAM2 data sets are comparable to those proposed by the first and/or second ranked teams in the DREAM2 competition. The structures of LP-SLGNs estimated from two published Saccharomyces cerevisiae cell cycle transcript profiling data sets capture known regulatory associations. In each S. cerevisiae LP-SLGN, the number of nodes with a particular degree follows an approximate power law, suggesting that its degree distribution is similar to that observed in real-world networks. Inspection of these LP-SLGNs suggests biological hypotheses amenable to experimental verification.
Conclusion: A statistically robust and computationally efficient LP-based method for estimating the topology of a large sparse undirected graph from high-dimensional data yields representations of genetic networks that are biologically plausible and useful abstractions of the structures of real genetic networks. Analysis of the statistical and topological properties of learned LP-SLGNs may have practical value; for example, genes with high random walk betweenness, a measure of the centrality of a node in a graph, are good candidates for intervention studies and hence integrated computational–experimental investigations designed to infer more realistic and sophisticated probabilistic directed graphical model representations of genetic networks. The LP-based solutions of the sparse linear regression problem described here may provide a method for learning the structure of transcription factor networks from transcript profiling and transcription factor binding motif data.
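As a rough, hedged illustration of the idea of casting structure learning as sparse (l1-penalised) linear regression, the Python sketch below estimates an undirected neighbourhood graph using scikit-learn's Lasso rather than the paper's LP formulation; the simulated data, the penalty value alpha, and the edge threshold are all illustrative assumptions, not values from the study.

    # Hedged sketch: LASSO-based neighbourhood selection for a sparse gene network.
    # This approximates the l1-penalised regression idea; it is NOT the LP method of the paper.
    import numpy as np
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(0)
    n_samples, n_genes = 50, 200                 # illustrative sizes
    X = rng.normal(size=(n_samples, n_genes))    # stand-in for transcript profiles

    adjacency = np.zeros((n_genes, n_genes), dtype=bool)
    for j in range(n_genes):
        y = X[:, j]                              # regress gene j on all other genes
        others = np.delete(X, j, axis=1)
        coef = Lasso(alpha=0.1, max_iter=5000).fit(others, y).coef_
        idx = np.delete(np.arange(n_genes), j)   # map coefficients back to gene indices
        adjacency[j, idx[np.abs(coef) > 1e-6]] = True

    # Symmetrise: keep an undirected edge if either regression selected it.
    adjacency = adjacency | adjacency.T
    print("estimated edges:", adjacency.sum() // 2)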
Abstract:
Criminological theories in cross-national studies of homicide have underestimated the effects of quality of governance, liberal democracy, and region. Data sets from several sources are combined and a comprehensive model of homicide is proposed. Results of the spatial regression model, which controls for the effect of spatial autocorrelation, show that quality of governance, human development, economic inequality, and ethnic heterogeneity are statistically significant in predicting homicide. In addition, the regions of Latin America and non-Muslim Sub-Saharan Africa have significantly higher rates of homicide, ceteris paribus, while the effects of East Asian countries and Islamic societies are not statistically significant. These findings are consistent with the expectations of the new modernization and regional theories.
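As a hedged, minimal sketch (not the article's actual specification or data), the snippet below fits an ordinary least-squares model to hypothetical country-level covariates and then computes Moran's I of the residuals under an invented row-standardised weight matrix, the kind of diagnostic that motivates a spatial regression controlling for spatial autocorrelation; all names and values are assumptions.

    # Hedged sketch: OLS of homicide rates on covariates, then Moran's I on the
    # residuals to gauge spatial autocorrelation (illustrative data throughout).
    import numpy as np

    rng = np.random.default_rng(1)
    n = 100
    X = np.column_stack([np.ones(n), rng.normal(size=(n, 3))])   # intercept + 3 hypothetical covariates
    y = X @ np.array([1.0, -0.5, -0.8, 0.6]) + rng.normal(scale=0.5, size=n)

    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    resid = y - X @ beta

    # Row-standardised binary contiguity weights (here: random symmetric neighbours).
    W = (rng.random((n, n)) < 0.05).astype(float)
    W = np.triu(W, 1)
    W = W + W.T
    W = W / np.maximum(W.sum(axis=1, keepdims=True), 1)

    z = resid - resid.mean()
    morans_I = (n / W.sum()) * (z @ W @ z) / (z @ z)
    print(f"Moran's I of residuals: {morans_I:.3f}")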
Abstract:
As part of the 2014 amendments to the Youth Justice Act 1992 (Qld), the previous Queensland government introduced a new breach of bail offence and a reverse onus provision in relation to the new offence. Also included in the raft of amendments was a provision removing the internationally accepted principle that, in relation to young offenders, detention should be used as ‘a last resort’. This article argues that these changes are likely to increase the entrenchment of young people within the criminal justice system.
Abstract:
Fatigue and sleepiness are major causes of road traffic accidents. However, precise data are often lacking because a validated and reliable device for detecting the level of sleepiness (cf. the breathalyzer for alcohol levels) does not exist, nor do criteria exist for the unambiguous detection of fatigue/sleepiness as a contributing factor in accident causation. Therefore, identification of risk factors and groups might not always be easy. Furthermore, it is extremely difficult to incorporate fatigue in operationalized terms into either traffic or criminal law. The main aims of this thesis were to estimate the prevalence of fatigue problems while driving among the Finnish driving population, to explore how VALT multidisciplinary investigation teams, Finnish police, and courts recognize (and prosecute) fatigue in traffic, to identify risk factors and groups, and finally to explore the application of the Finnish Road Traffic Act (RTA), which explicitly forbids driving while tired in Article 63. Several different sources of data were used: a computerized database and the original folders of multidisciplinary teams investigating fatal accidents (VALT), the driver records database (AKE), prosecutor and court decisions, a survey of young male military conscripts, and a survey of a representative sample of the Finnish active driving population. The results show that 8-15% of fatal accidents during 1991-2001 were fatigue-related, that every fifth Finnish driver has fallen asleep while driving at some point during his/her driving career, and that the Finnish police and courts punish on average one driver per day on the basis of fatigued driving (based on the data from the years 2004-2005). The main finding regarding risk factors and risk groups is that during the summer months, especially in the afternoon, the risk of falling asleep while driving is increased. Furthermore, the results indicate that those with a higher risk of falling asleep while driving are men in general, but especially young male drivers including military conscripts and the elderly during the afternoon hours and the summer in particular; professional drivers breaking the rules about duty and rest hours; and drivers with a tendency to fall asleep easily. A time-of-day pattern of sleep-related incidents was repeatedly found. It was found that VALT teams can be considered relatively reliable when assessing the role of fatigue and sleepiness in accident causation; thus, similar experts might be valuable in the court process as expert witnesses when fatigue or sleepiness is suspected to have a role in an accident’s origins. However, the application of Article 63 of the RTA, which forbids, among other things, fatigued driving, will continue to be an issue that deserves further attention. This should be done in the context of a needed attitude change towards driving while in a state of extreme tiredness (e.g., after being awake for more than 24 hours), which produces performance deterioration comparable to illegal intoxication (BAC around 0.1%). Regarding the well-known interactive effect of increased sleepiness and even small alcohol levels, the relatively high proportion (up to 14.5%) of Finnish drivers owning and using a breathalyzer raises some concern. This concern exists because these drivers are obviously more focused on staying below the “magic” 0.05% BAC limit than on their actual driving impairment, which might be much worse than they realize because of the interactive effects of increased sleepiness and even low alcohol consumption. In conclusion, there is no doubt that fatigue and sleepiness problems while driving are common among the Finnish driving population. While we wait for the invention of reliable devices for fatigue/sleepiness detection, we should invest more effort in raising public awareness about the dangers of fatigued driving and educate drivers about how to recognize and deal with fatigue and sleepiness when they ultimately occur.
Abstract:
Learner and first year probationary motorcyclists are over-represented in traffic accidents, being involved about four times as often as full motorcycle licence holders in relation to their numbers. In an attempt to reduce this over-involvement, the Victorian Government amended the law in 1979 to restrict learner and first year probationary motorcyclists to motorcycles with engine capacities of less than 260 cc. This paper reports an evaluation which showed that casualty rates for learner and first year probationers began to decrease from mid 1979 and continued to do so until the end of 1980. A further analysis indicated that compared to full licence holder casualties, learner permit casualties were about 40% less than expected while first year probationary casualties were about 39% lower.
Abstract:
Regression rates of a hypergolic combination of fuel and oxidiser have been experimentally measured as a function of chamber pressure, mass flux and the percentage component of the hypergolic compound in natural rubber. The hypergolic compound used is difurfurylidene cyclohexanone (DFCH), which is hypergolic with the oxidiser red fuming nitric acid (RFNA) with an ignition delay of 60-70 ms. Data on weight loss versus time were obtained for burn times varying between 5 and 20 seconds. Two methods of correlating the data, using the mass flux of oxidiser and the total flux of hot gases, show that the index n of the regression law r = a Gox^n or r = a G^n x^(n-1) (where x is the axial distance) is about 0.5 or a little lower, and not 0.8, even though the flow through the port is turbulent. It is argued that the reduction of the index n is due to heterogeneous reaction between the liquid oxidiser and the hypergolic fuel component on the surface.
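As a hedged illustration of how a power-law index of this kind is commonly extracted (not the authors' actual reduction of their weight-loss data), the sketch below fits r = a Gox^n by linear least squares in log-log space; the flux and regression-rate values are invented.

    # Hedged sketch: estimating the index n in r = a * Gox**n by a log-log
    # least-squares fit. Data values are illustrative, not from the paper.
    import numpy as np

    Gox = np.array([20.0, 40.0, 60.0, 80.0, 120.0])   # oxidiser mass flux, kg/(m^2 s) (hypothetical)
    r   = np.array([0.55, 0.80, 0.97, 1.12, 1.38])    # regression rate, mm/s (hypothetical)

    # ln r = ln a + n * ln Gox  ->  straight-line fit in log space
    n_index, ln_a = np.polyfit(np.log(Gox), np.log(r), 1)
    print(f"fitted n = {n_index:.2f}, a = {np.exp(ln_a):.3f}")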
Abstract:
For many, particularly in the Anglophone world and Western Europe, it may be obvious that Google has a monopoly over online search and advertising and that this is an undesirable state of affairs, due to Google’s ability to mediate information flows online. The baffling question may be why governments and regulators are doing little to nothing about this situation, given the increasingly pivotal importance of the internet and free-flowing communications in our lives. However, the law concerning monopolies, namely antitrust or competition law, works in what may be seen as a less intuitive way by the general public. Monopolies themselves are not illegal. Conduct that is unlawful, i.e. abuses of that market power, is defined by a complex set of rules and revolves principally around economic harm suffered due to anticompetitive behavior. However, the effect of information monopolies over search, such as Google’s, is more than just economic, yet competition law does not address this. Furthermore, Google’s collection and analysis of user data and its portfolio of related services make it difficult for others to compete. Such a situation may also explain why Google’s established search rivals, Bing and Yahoo, have not managed to provide services that are as effective or popular as Google’s own (on this issue see also the texts by Dirk Lewandowski and Astrid Mager in this reader). Users, however, are not entirely powerless. Google’s business model rests, at least partially, on them – especially the data collected about them. If they stop using Google, then Google is nothing.
Abstract:
Volumetric method based adsorption measurements of nitrogen on two specimens of activated carbon (Fluka and Sarabhai) reported by us are refitted to two popular isotherms, namely, Dubinin-Astakhov (D-A) and Toth, in light of improved fitting methods derived recently. Those isotherms have been used to derive other data of relevance in the design of engineering equipment, such as the concentration dependence of the heat of adsorption and Henry’s law coefficients. The present fits provide a better representation of the experimental measurements than before because the temperature dependence of the adsorbed phase volume and the structural heterogeneity of the micropore distribution have been accounted for in the D-A equation. A new correlation for the Toth equation is a further contribution. The heat of adsorption in the limiting uptake condition is correlated with the Henry’s law coefficients at the near-zero uptake condition.
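For readers unfamiliar with the form of the Dubinin-Astakhov equation, a minimal curve-fitting sketch follows; the conditions, data points and initial parameter guesses are invented, and the temperature dependence of the adsorbed-phase volume and the micropore-heterogeneity corrections discussed above are not modelled.

    # Hedged sketch: fitting the Dubinin-Astakhov (D-A) isotherm
    #   W = W0 * exp(-(A/E)**m),  A = R*T*ln(Ps/P)
    # to invented uptake data; parameter values are purely illustrative.
    import numpy as np
    from scipy.optimize import curve_fit

    R, T, Ps = 8.314, 77.0, 101.325e3     # J/(mol K), K, Pa (illustrative conditions)

    def dubinin_astakhov(P, W0, E, m):
        A = R * T * np.log(Ps / P)        # adsorption potential, J/mol
        return W0 * np.exp(-(A / E) ** m)

    P = np.array([1e3, 5e3, 1e4, 3e4, 6e4, 9e4])          # pressure, Pa (hypothetical)
    W = np.array([0.08, 0.18, 0.24, 0.31, 0.35, 0.37])    # uptake, e.g. cm3/g (hypothetical)

    popt, _ = curve_fit(dubinin_astakhov, P, W, p0=[0.4, 3000.0, 2.0])
    print("fitted W0, E, m:", popt)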