980 results for Regulatory Models
Abstract:
The rise of the peer economy poses complex new regulatory challenges for policy-makers. The peer economy, typified by services like Uber and AirBnB, promises substantial productivity gains through the more efficient use of existing resources and a marked reduction in regulatory overheads. These services are rapidly disrupting established markets, but the regulatory trade-offs they present are difficult to evaluate. In this paper, we examine the peer economy through the lens of ride-sharing and the ongoing struggle over regulatory legitimacy between the taxi industry and new entrants Uber and Lyft. We first sketch the outlines of ride-sharing as a complex regulatory problem, showing how questions of efficiency are necessarily bound up with questions about levels of service, controls over pricing, and different approaches to setting, upholding, and enforcing standards. We outline the need for data-driven policy to understand how algorithmic systems work and what effects these might have in the medium to long term on measures of service quality, safety, labour relations, and equality. Finally, we discuss how the competition for legitimacy is not primarily being fought on utilitarian grounds, but is instead carried out within the context of a heated ideological battle between different conceptions of the roles of the state and private firms as regulators. We ultimately argue that the key to understanding these regulatory challenges is to develop better conceptual models of the governance of complex systems by private actors and of the methods available to the state for influencing their actions. These struggles are not, as is often thought, struggles between regulated and unregulated systems; rather, they turn on the important regulatory work carried out by powerful, centralised private firms – both the incumbents of existing markets and the disruptive network operators of the peer economy.
Abstract:
The Australian food system contributes significantly to a range of key environmental issues, including harmful greenhouse gas emissions, air pollution, soil desertification, biodiversity loss and water scarcity. At the same time, the Australian food system is a key cause of public health nutrition issues that stem from the co-existence of over- and under-consumption of dietary energy and nutrients. Within these challenges lie synergies and opportunities, because a diet that has a lower environmental impact generally aligns with good nutrition. Australian State and Federal initiatives to influence food consumption patterns focus on individual body weight and ‘soft law’ interventions. These regulatory approaches, by focusing on select symptoms of food system failures, are fragmented, reductionist and inefficient. To illustrate this point, this paper explores Australian regulatory responses to diet-related illnesses. The analysis supports the argument that only when regulatory responses to diets become embedded within reform of the current food system will substantial improvements to human and planetary health be achieved.
Abstract:
Stakeholders commonly agree that food systems need to be urgently reformed. Yet, how food systems should be reformed is extremely contested. Public international law and regulations are uniquely placed to influence and guide law, policy, programmes and action at regional, national and local levels. Although plenty of international legal instruments intersect with food-related issues, the international regulation of food systems is fragmented, understudied and contested. In order to address these issues, this paper maps and analyses the public international regulatory aspects of food production with a view to providing recommendations for reform. Accordingly, this paper brings together a variety of binding and non-binding international regulatory instruments that, to varying degrees and from a range of angles, deal with the first activity in the food system: food production. The paper traces the regulatory tools from natural resources, to the farmers and farm workers who apply skill and experience, and finally to the different dimensions of world trade in food. The various regulatory instruments identified, and their collective whole, are analysed against a rights-based approach to food security.
Abstract:
This presentation discussed the growing recognition of sustainable diets at international governance levels and how this reflects the challenges and win-win opportunities of living within our ecological limits. I assert that sustainable diets provide an example of how living within our ecological limits would actually make us better off, even apart from environmental benefits. After determining whether Australians generally have a sustainable diet, I outlined how Australian regulators are attempting to address sustainable diets. I argued that the personal responsibility approach, coupled with the focus on preventing or reducing overweight and obesity, is proving incapable of bringing about long-term sustainable diets that will contribute to the health and well-being of Australian people.
Abstract:
This article evaluates two policy initiatives by the United States Government to address access to essential medicines -- Priority Review vouchers and “Patents for Humanity”. Such proposals are aimed at speeding up the regulatory review of inventions with humanitarian uses and applications by the United States Food and Drug Administration, and the United States Patent and Trademark Office. It is argued that such measures fall short of international standards and norms established by the World Intellectual Property Organization Development Agenda 2007; the World Trade Organization’s Doha Declaration on the TRIPS Agreement and Public Health 2001 and the WTO General Council Decision of August 30, 2003; and the World Health Organization’s declarations on intellectual property and public health. This article concludes that there is a need for broader patent law reform in the United States to address matters of patent law and public health. Moreover, there is a need to experiment with other, more promising alternative models of research and development -- such as medical innovation prizes, a Health Impact Fund, the Medicines Patent Pool, and Open Source Drug Discovery.
Abstract:
In the United States, there has been fierce debate over state, federal and international efforts to engage in genetically modified food labelling (GM food labelling). A grassroots coalition of consumers, environmentalists, organic farmers, and the food movement has pushed for law reform in respect of GM food labelling. The Just Label It campaign has encouraged United States consumers to send comments to the United States Food and Drug Administration to label genetically modified foods. This Chapter explores the various justifications made in respect of GM food labelling. There has been a considerable effort to portray the issue of GM food labelling as one of consumer rights as part of ‘the right to know’. There has been a significant battle amongst farmers over GM food labelling – with organic farmers and biotechnology companies fighting for precedence. There has also been a significant discussion about the use of GM food labelling as a form of environmental legislation. The prescriptions in GM food labelling regulations may serve to promote eco-labelling, and deter greenwashing. There has been a significant debate over whether GM food labelling may serve to regulate corporations – particularly from the food, agriculture, and biotechnology industries. There are significant issues about the interaction between intellectual property laws – particularly in respect of trade mark law and consumer protection – and regulatory proposals focused upon biotechnology. There has been a lack of international harmonization in respect of GM food labelling. As such, there has been a major use of comparative arguments about regulatory models in respect of food labelling. There has also been a discussion about international law, particularly with the emergence of sweeping regional trade proposals, such as the Trans-Pacific Partnership, and the Trans-Atlantic Trade and Investment Partnership.
This Chapter considers the United States debates over genetically modified food labelling – at state, federal, and international levels. The battles often involved the use of citizen-initiated referenda. The conflicts have been policy-centric disputes – pitting organic farmers, consumers, and environmentalists against the food industry and biotechnology industry. Such battles have raised questions about consumer rights, public health, freedom of speech, and corporate rights. The disputes highlighted larger issues about lobbying, fund-raising, and political influence. The role of money in United States politics has been a prominent concern of Lawrence Lessig in his recent academic and policy work with the group Rootstrikers. Part 1 considers the debate in California over Proposition 37. Part 2 explores other key state initiatives in respect of GM food labelling. Part 3 examines the Federal debate in the United States over GM food labelling. Part 4 explores whether regional trade agreements – such as the Trans-Pacific Partnership (TPP) and the Trans-Atlantic Trade and Investment Partnership (TTIP) – will impact upon GM food labelling.
Abstract:
Broad knowledge is required when a business process is modeled by a business analyst. We argue that existing Business Process Management methodologies do not consider business goals at the appropriate level. In this paper we present an approach to integrate business goals and business process models. We design a Business Goal Ontology for modeling business goals. Furthermore, we devise a modeling pattern for linking the goals to process models and show how the ontology can be used in query answering. In this way, we integrate the intentional perspective into our business process ontology framework, enriching the process description and enabling new types of business process analysis. © 2008 IEEE.
Abstract:
Speech recognition can be improved by using visual information in the form of lip movements of the speaker in addition to audio information. To date, state-of-the-art techniques for audio-visual speech recognition continue to use audio and visual data of the same database for training their models. In this paper, we present a new approach to make use of one modality of an external dataset in addition to a given audio-visual dataset. By so doing, it is possible to create more powerful models from other extensive audio-only databases and adapt them on our comparatively smaller multi-stream databases. Results show that the presented approach outperforms the widely adopted synchronous hidden Markov models (HMM) trained jointly on audio and visual data of a given audio-visual database for phone recognition by 29% relative. It also outperforms the external audio models trained on extensive external audio datasets and also internal audio models by 5.5% and 46% relative respectively. We also show that the proposed approach is beneficial in noisy environments where the audio source is affected by the environmental noise.
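The adaptation approach in the abstract combines separately trained audio and visual models. A common fusion scheme in multi-stream systems is a weighted sum of per-class stream log-likelihoods, with the audio weight lowered under acoustic noise. A minimal sketch of that weighting idea (the phone scores below are illustrative toy numbers, not outputs of any real model, and this is not the paper's specific adaptation method):

```python
import numpy as np

def multistream_score(log_audio, log_visual, weight=0.7):
    """Weighted multi-stream combination: per-class log-likelihoods from an
    audio model and a visual (lip) model are fused as w*la + (1-w)*lv.
    Lowering w down-weights a noise-corrupted audio stream."""
    return weight * log_audio + (1 - weight) * log_visual

# Toy per-phone log-likelihoods (hypothetical numbers for illustration):
phones = ["p", "b", "m"]
la = np.array([-4.0, -3.5, -6.0])   # the audio model favours "b"
lv = np.array([-5.0, -6.0, -2.0])   # the visual model favours "m"

# With a high audio weight the fused decision follows the audio stream;
# with a low weight (simulating a noisy audio source) it follows the visual one.
clean_choice = phones[int(np.argmax(multistream_score(la, lv, weight=0.9)))]
noisy_choice = phones[int(np.argmax(multistream_score(la, lv, weight=0.3)))]
```

The same weighted combination underlies synchronous multi-stream HMM decoding, where it is applied per state rather than per utterance.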
Abstract:
The 3D Water Chemistry Atlas is an intuitive, open source, Web-based system that enables the three-dimensional (3D) sub-surface visualization of ground water monitoring data, overlaid on the local geological model (formation and aquifer strata). This paper firstly describes the results of evaluating existing virtual globe technologies, which led to the decision to use the Cesium open source WebGL Virtual Globe and Map Engine as the underlying platform. Next it describes the backend database and search, filtering, browse and analysis tools that were developed to enable users to interactively explore the groundwater monitoring data and interpret it spatially and temporally relative to the local geological formations and aquifers via the Cesium interface. The result is an integrated 3D visualization system that enables environmental managers and regulators to assess groundwater conditions, identify inconsistencies in the data, manage impacts and risks and make more informed decisions about coal seam gas extraction, waste water extraction, and water reuse.
Abstract:
Appropriate selection of scaffold architecture is a key challenge in cartilage tissue engineering. Gap junction-mediated intercellular contacts play important roles in precartilage condensation of mesenchymal cells. However, scaffold architecture could potentially restrict cell-cell communication and differentiation. This is particularly important when choosing the appropriate culture platform as well as scaffold-based strategy for clinical translation, that is, hydrogel or microtissues, for investigating differentiation of chondroprogenitor cells in cartilage tissue engineering. We, therefore, studied the influence of gap junction-mediated cell-cell communication on chondrogenesis of bone marrow-derived mesenchymal stromal cells (BM-MSCs) and articular chondrocytes. Expanded human chondrocytes and BM-MSCs were either (re-) differentiated in micromass cell pellets or encapsulated as isolated cells in alginate hydrogels. Samples were treated with and without the gap junction inhibitor 18-α glycyrrhetinic acid (18αGCA). DNA and glycosaminoglycan (GAG) content and gene expression levels (collagen I/II/X, aggrecan, and connexin 43) were quantified at various time points. Protein localization was determined using immunofluorescence, and adenosine-5'-triphosphate (ATP) was measured in conditioned media. While GAG/DNA was higher in alginate compared with pellets for chondrocytes, there were no differences in chondrogenic gene expression between culture models. Gap junction blocking reduced collagen II and extracellular ATP in all chondrocyte cultures and in BM-MSC hydrogels. However, differentiation capacity was not abolished completely by 18αGCA. Connexin 43 levels were high throughout chondrocyte cultures and peaked only later during BM-MSC differentiation, consistent with the delayed response of BM-MSCs to 18αGCA. 
Alginate hydrogels and microtissues are equally suitable culture platforms for the chondrogenic (re-)differentiation of expanded human articular chondrocytes and BM-MSCs. Therefore, reducing direct cell-cell contacts does not affect in vitro chondrogenesis. However, blocking gap junctions compromises cell differentiation, pointing to a prominent role for hemichannel function in this process. Therefore, scaffold design strategies that promote an increasing distance between single chondroprogenitor cells do not restrict their differentiation potential in tissue-engineered constructs.
Abstract:
This study compares Value-at-Risk (VaR) measures for Australian banks over a period that includes the Global Financial Crisis (GFC) to determine whether the methodology and parameter selection are important for capital adequacy holdings that will ultimately support a bank in a crisis period. The VaR methodology promoted under Basel II was widely criticised during the GFC for its failure to capture downside risk. However, results from this study indicate that 1-year parametric and historical models produce better measures of VaR than models with longer time frames. VaR estimates produced using Monte Carlo simulations show a high percentage of violations but with a lower average magnitude of violation when they occur. VaR estimates produced by the ARMA-GARCH model also show a relatively high percentage of violations; however, the average magnitude of a violation is quite low. Our findings support the design of the revised Basel II VaR methodology, which has also been adopted under Basel III.
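The historical and parametric measures compared above can be sketched in a few lines. This is a generic one-day 99% VaR illustration under the standard definitions (the study's actual estimation windows, return data, and backtesting procedure are not reproduced here):

```python
import numpy as np
from statistics import NormalDist

def historical_var(returns, alpha=0.99):
    """Historical VaR: the empirical lower-tail quantile of past returns,
    reported as a positive loss number."""
    return -np.quantile(returns, 1 - alpha)

def parametric_var(returns, alpha=0.99):
    """Parametric (normal) VaR from the sample mean and standard deviation."""
    z = NormalDist().inv_cdf(1 - alpha)      # about -2.326 at the 99% level
    return -(returns.mean() + z * returns.std(ddof=1))

def count_violations(returns, var_estimates):
    """Backtest: a violation is a day whose loss exceeds that day's VaR."""
    return int(np.sum(returns < -np.asarray(var_estimates)))

rng = np.random.default_rng(0)
window = rng.normal(0.0005, 0.01, 250)       # ~1 trading year of daily returns
hist_var = historical_var(window)
param_var = parametric_var(window)
```

Both measures are quoted as positive loss magnitudes, so a day violates the estimate when its return falls below the negated VaR.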
Abstract:
A model based on the cluster process representation of the self-exciting process models in White and Porter (2013) and Ruggeri and Soyer (2008) is derived to allow for variation in the excitation effects for terrorist events in a self-exciting or cluster process model. The details of the model derivation and implementation are given and applied to data from the Global Terrorism Database from 2000–2012. Results are discussed in terms of practical interpretation, along with implications for a theoretical model paralleling existing criminological theory.
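A self-exciting process of the kind described above can be simulated with Ogata's thinning algorithm: each event temporarily raises the intensity, producing clusters of aftershock-like events. A minimal sketch with a constant baseline and an exponentially decaying excitation kernel (the parameter values are arbitrary illustrations, not estimates from the Global Terrorism Database):

```python
import math
import random

def simulate_hawkes(mu, alpha, beta, horizon, seed=42):
    """Ogata thinning for a Hawkes process with conditional intensity
    lam(t) = mu + sum_i alpha * exp(-beta * (t - t_i)) over past events t_i.
    Because the kernel decays, lam evaluated at the current time is a valid
    upper bound until the next event."""
    rng = random.Random(seed)
    events, t = [], 0.0
    while t < horizon:
        lam_bar = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        t += rng.expovariate(lam_bar)        # candidate next event time
        if t >= horizon:
            break
        lam_t = mu + sum(alpha * math.exp(-beta * (t - ti)) for ti in events)
        if rng.random() <= lam_t / lam_bar:  # accept with prob lam(t)/lam_bar
            events.append(t)
    return events

# Branching ratio alpha/beta < 1 keeps the process stable (subcritical).
events = simulate_hawkes(mu=0.5, alpha=0.8, beta=1.2, horizon=100.0)
```

Allowing alpha (the excitation effect) to vary across events, rather than fixing it, is the kind of extension the abstract describes.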
Abstract:
We present a methodology to extract legal norms from regulatory documents for their formalisation and later compliance checking. The need for the methodology is motivated by the shortcomings of existing approaches, in which the rule type and the process aspects relevant to the rules are largely overlooked. The methodology incorporates the well-known IF...THEN structure, extended with the process aspect and rule type, and guides how to properly extract the conditions and logical structure of legal rules for reasoning and for modelling obligations for compliance checking.
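The extended IF...THEN structure can be pictured as a small data model: a rule carries a type (e.g. obligation or prohibition), the process task it attaches to, and a triggering condition. A naive compliance check over one process execution might then look as follows (the rule names and check logic are hypothetical illustrations, not the paper's formalism):

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class LegalRule:
    """A rule in the extended IF...THEN form: IF <condition> holds for a
    case, THEN a deontic effect attaches to a process task."""
    rule_type: str                      # "obligation" or "prohibition"
    task: str                           # the process aspect the rule targets
    condition: Callable[[dict], bool]   # predicate over the case data

def violations(rules, executed_tasks, case):
    """Naive compliance check: a triggered obligation requires its task to
    have been executed; a triggered prohibition forbids it."""
    found = []
    for r in rules:
        if not r.condition(case):
            continue                    # rule not triggered for this case
        done = r.task in executed_tasks
        if r.rule_type == "obligation" and not done:
            found.append((r.task, "obligatory task not executed"))
        elif r.rule_type == "prohibition" and done:
            found.append((r.task, "prohibited task executed"))
    return found

# Hypothetical rules for a loan-approval process:
rules = [
    LegalRule("obligation", "verify_identity", lambda c: c["amount"] > 10_000),
    LegalRule("prohibition", "auto_approve", lambda c: c["amount"] > 10_000),
]
result = violations(rules, {"auto_approve"}, {"amount": 25_000})
```

Full compliance checking over defeasible obligations requires a dedicated reasoner; the point here is only how condition, rule type, and process task fit together in one extracted rule.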
Abstract:
Collective cell spreading is frequently observed in development, tissue repair and disease progression. Mathematical modelling used in conjunction with experimental investigation can provide key insights into the mechanisms driving the spread of cell populations. In this study, we investigated how experimental and modelling frameworks can be used to identify several key features underlying collective cell spreading. In particular, we were able to independently quantify the roles of cell motility and cell proliferation in a spreading cell population, and investigate how these roles are influenced by factors such as the initial cell density, type of cell population and the assay geometry.
Abstract:
In this paper it is demonstrated how the Bayesian parametric bootstrap can be adapted to models with intractable likelihoods. The approach is most appealing when the semi-automatic approximate Bayesian computation (ABC) summary statistics are selected. After a pilot run of ABC, the likelihood-free parametric bootstrap approach requires very few model simulations to produce an approximate posterior, which can be a useful approximation in its own right. An alternative is to use this approximation as a proposal distribution in ABC algorithms to make them more efficient. In this paper, the parametric bootstrap approximation is used to form the initial importance distribution for the sequential Monte Carlo and the ABC importance and rejection sampling algorithms. The new approach is illustrated through a simulation study of the univariate g-and-k quantile distribution, and is used to infer parameter values of a stochastic model describing expanding melanoma cell colonies.
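The g-and-k distribution is a natural ABC benchmark because it is easy to simulate from (by transforming standard normals through its quantile function) while its likelihood is intractable. A bare-bones ABC rejection sketch for two of its parameters (simple median/IQR summaries stand in for the paper's semi-automatic summaries, and the priors and tolerance are arbitrary illustrations):

```python
import numpy as np

def gk_sample(a, b, g, k, n, rng, c=0.8):
    """Draw n values from the g-and-k distribution by pushing standard
    normals through its quantile function (c = 0.8 is the usual convention)."""
    z = rng.standard_normal(n)
    return a + b * (1 + c * (1 - np.exp(-g * z)) / (1 + np.exp(-g * z))) \
             * (1 + z**2) ** k * z

def summaries(x):
    """Location/spread summaries (median and IQR) of a sample."""
    q = np.quantile(x, [0.25, 0.5, 0.75])
    return np.array([q[1], q[2] - q[0]])

rng = np.random.default_rng(1)
observed = summaries(gk_sample(3.0, 1.0, 2.0, 0.5, 500, rng))

# ABC rejection: sample (a, b) from uniform priors (g and k held fixed here
# only to keep the sketch short) and keep draws whose simulated summaries
# land close to the observed ones.
accepted = [
    (a, b)
    for a, b in zip(rng.uniform(0, 6, 3000), rng.uniform(0.1, 3, 3000))
    if np.linalg.norm(summaries(gk_sample(a, b, 2.0, 0.5, 500, rng)) - observed) < 0.5
]
posterior = np.array(accepted)
```

The paper's contribution sits on top of this baseline: a parametric-bootstrap approximation built from a pilot ABC run replaces the prior as the proposal, so far fewer of the 3000 draws above would be wasted on implausible parameter values.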