958 results for Open Content
Abstract:
The only effective and scalable way to regulate the actions of people on the internet is through online intermediaries. These are the institutions that facilitate communication: internet service providers, search engines, content hosts, and social networks. Governments, private firms, and civil society organisations are increasingly seeking to influence these intermediaries to take more responsibility to prevent or respond to IP infringements. Around the world, intermediaries are increasingly subject to a variety of obligations to help enforce IP rights, ranging from informal social and governmental pressure, to industry codes and private negotiated agreements, to formal legislative schemes. This paper provides an overview of this emerging shift in regulatory approaches, away from legal liability and towards increased responsibilities for intermediaries. This shift straddles two different potential futures: an optimistic set of more effective, more efficient mechanisms for regulating user behaviour, and a dystopian vision of rule by algorithm and private power, without the legitimising influence of the rule of law.
Abstract:
Background Feeding practices are commonly examined as potentially modifiable determinants of children's eating behaviours and weight status. Although a variety of questionnaires exist to assess different feeding aspects, many lack thorough reliability and validity testing. The Feeding Practices and Structure Questionnaire (FPSQ) is a tool designed to measure early feeding practices related to non-responsive feeding and the structure of the meal environment. Face validity, factorial validity, internal reliability and cross-sectional correlations with children's eating behaviours have been established in mothers of 2-year-old children. The aim of the present study was to further extend the validity of the FPSQ by examining factorial, construct and predictive validity, and stability. Methods Participants were from the NOURISH randomised controlled trial, which evaluated an intervention with first-time mothers designed to promote protective feeding practices. Maternal feeding practices (FP) and child eating behaviours were assessed when children were aged 2 years and 3.7 years (n = 388). Confirmatory factor analysis, group differences, predictive relationships, and stability were tested. Results The original 9-factor structure was confirmed when children were aged 3.7 ± 0.3 years. Cronbach's alpha was above the recommended 0.70 cut-off for all factors except Structured Meal Timing, Over Restriction and Distrust in Appetite, which were 0.58, 0.67 and 0.66, respectively. Allocated group differences reflected behaviour consistent with intervention content, and all feeding practices were stable across both time points (range of r = 0.45-0.70). There was some evidence for the predictive validity of factors, with 2 FP showing expected relationships, 2 FP showing both expected and unexpected relationships, and 5 FP showing no relationship. Conclusions Reliability and validity were demonstrated for most subscales of the FPSQ.
Future validation is warranted with culturally diverse samples and with fathers and other caregivers. The use of additional outcomes to further explore predictive validity is recommended, as is testing the construct validity and test-retest reliability of the questionnaire.
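The 0.70 cut-off mentioned above is the conventional acceptability threshold for Cronbach's alpha. As a minimal sketch of how the statistic is computed (the item scores below are entirely hypothetical, not NOURISH data, and the function name is ours):

```python
from statistics import variance

def cronbach_alpha(items):
    """Cronbach's alpha for a list of respondent rows (one score per item)."""
    k = len(items[0])
    # Sum of per-item sample variances, and variance of the total scores
    item_vars = sum(variance(col) for col in zip(*items))
    total_var = variance([sum(row) for row in items])
    return (k / (k - 1)) * (1 - item_vars / total_var)

# Hypothetical 5-item subscale scored by 6 respondents
scores = [
    [4, 5, 4, 4, 5],
    [2, 2, 3, 2, 2],
    [5, 5, 5, 4, 5],
    [3, 3, 2, 3, 3],
    [4, 4, 4, 5, 4],
    [1, 2, 1, 1, 2],
]
alpha = cronbach_alpha(scores)
print(alpha > 0.70)  # comfortably above the cut-off for this made-up data
```

Subscales such as Structured Meal Timing falling below 0.70 simply means their items co-vary less strongly relative to the total-score variance.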
Abstract:
Background Most patients with minor stroke are discharged directly home from acute care, under the assumption that little will be required in the way of adaptation and adjustment because informal caregivers will manage the stroke recovery process. We explored the perceptions of male patients with minor stroke and their wife-caregivers regarding factors affecting quality of life and caregiver strain encountered during the first year post-discharge. Methods Data were obtained from responses to two open-ended questions, part of quality of life and caregiver strain scales administered to participants in a larger descriptive study. Conventional content analysis was used to assess narrative accounts of living with minor stroke provided by 26 male patients and their wife-caregivers over a period of one year post-discharge. Results Two major themes emerged from these data: 'being vulnerable' and 'realization'. Subthemes within the vulnerability theme included changes to patients' masculine image and wife-caregivers' assumption of a hyper-vigilance role. In terms of 'realization', patients and their wife-caregivers shared 'loss' as well as 'changing self and relationships'. Patients in this study focused primarily on their physical recovery and their perceptions of necessary changes. Wife-caregivers were actively involved in managing the day-to-day demands that stroke placed on individual, family and social roles. Conclusions We conclude that patients and wife-caregivers expend considerable time and energy re-establishing control of their lives following minor stroke in an attempt to incorporate changes to self and their relationship into the fabric of their lives.
A Deweyan experience economy for higher education : The case of the Australian Indie 100 Music Event
Abstract:
In this essay we argue that a Deweyan experience economy will best support the higher education (HE) sector in the future, and we draw a contrast between that economy and the sector’s current focus on informational concerns, as expressed by the recent rush to Massive Open Online Courses (MOOCs) and other mass online informational offerings. We base our argument on current developments in music education and music technology that we see as being preemptive of wider trends. We use examples from a three-year study of online and offline music pedagogies and outline a four-year experiment in developing a pedagogical experience economy to illustrate a theoretical position informed by John Dewey’s theory of experience, Pierre Bourdieu’s theory of habitus and capital, and recent work in economic geography on epistemic communities. We argue further that the future of the HE sector is local rather than global, experiential rather than informational, and that therefore a continued informational approach to the future of HE risks undermining the sector.
Abstract:
Hydrothermal liquefaction (HTL) presents a viable route for converting a vast range of materials into liquid fuel, without the need for pre-drying. Currently, HTL studies produce bio-crude with properties that fall short of diesel or biodiesel standards. Upgrading bio-crude improves the physical and chemical properties to produce a fuel corresponding to diesel or biodiesel. Properties such as viscosity, density, heating value, oxygen, nitrogen and sulphur content, and chemical composition can be modified towards meeting fuel standards using strategies such as solvent extraction, distillation, hydrodeoxygenation and catalytic cracking. This article presents a review of the upgrading technologies available, and how they might be used to make HTL bio-crude into a transportation fuel that meets current fuel property standards.
Abstract:
Big Tobacco has been engaged in a dark, shadowy plot and conspiracy to hijack the Trans-Pacific Partnership Agreement (TPP) and undermine tobacco control measures – such as graphic health warnings and the plain packaging of tobacco products... In the context of this heavy lobbying by Big Tobacco and its proxies, this chapter provides an analysis of the debate over trade, tobacco, and the TPP. This discussion is necessarily focused on the negotiations of the free trade agreement – the shadowy conflicts before the finalisation of the text. This chapter contends that the trade negotiations threaten hard-won gains in public health – including international developments such as the WHO Framework Convention on Tobacco Control, and domestic measures, such as graphic health warnings and the plain packaging of tobacco products. It maintains that there is a need for regional trade agreements to respect the primacy of the WHO Framework Convention on Tobacco Control. There is a need both to provide for an open and transparent process regarding such trade negotiations, as well as a due and proper respect for public health in terms of substantive obligations. Part I focuses on the debate over the intellectual property chapter of the TPP, within the broader context of domestic litigation against Australia’s plain tobacco packaging regime and associated WTO disputes. Part II examines the investment chapter of the TPP, taking account of ongoing investment disputes concerning tobacco control and the declared approaches of Australia and New Zealand to investor-state dispute settlement. Part III looks at the discussion as to whether there should be specific text on tobacco control in the TPP, and, if so, what should be its nature and content. This chapter concludes that the plain packaging of tobacco products – and other best practices in tobacco control – should be adopted by members of the Pacific Rim.
Abstract:
On the 28th May 2014, a petition signed by 1.8 million people worldwide was delivered to the Australian Parliament to protest against the radical secrecy surrounding the Trans-Pacific Partnership.
Abstract:
On the 12th June 2014, Elon Musk, the chief executive officer of the electric car manufacturer, Tesla Motors, announced in a blog that ‘all our patents belong to you.’ He explained that the company would adopt an open source philosophy in respect of its intellectual property in order to encourage the development of the industry of electric cars, and address the carbon crisis. Elon Musk made the dramatic, landmark announcement: Yesterday, there was a wall of Tesla patents in the lobby of our Palo Alto headquarters. That is no longer the case. They have been removed, in the spirit of the open source movement, for the advancement of electric vehicle technology.
Abstract:
Background and purpose There are no published studies on the parameterisation and reliability of the single-leg stance (SLS) test with inertial sensors in stroke patients. Purpose: to analyse the reliability (intra-observer/inter-observer) and sensitivity of inertial sensors used for the SLS test in stroke patients. A secondary objective was to compare the records of the two inertial sensors (trunk and lumbar) to detect any significant differences in the kinematic data obtained in the SLS test. Methods Design: cross-sectional study. While performing the SLS test, two inertial sensors were placed at the lumbar (L5-S1) and trunk (T7-T8) regions. Setting: Laboratory of Biomechanics (Health Science Faculty, University of Málaga). Participants: four chronic stroke survivors (over 65 years old). Measurements: displacement and velocity; rotation (X-axis), flexion/extension (Y-axis) and inclination (Z-axis); resultant displacement and velocity (RV), computed as RV = √(Vx² + Vy² + Vz²). Along with the SLS kinematic variables, descriptive analyses, differences between sensor locations, and intra-observer and inter-observer reliability were also calculated. Results Differences between the sensors were significant only for left inclination velocity (p = 0.036) and extension displacement in the non-affected leg with eyes open (p = 0.038). Intra-observer reliability of the trunk sensor ranged from 0.889-0.921 for displacement and 0.849-0.892 for velocity. Intra-observer reliability of the lumbar sensor was between 0.896-0.949 for displacement and 0.873-0.894 for velocity. Inter-observer reliability of the trunk sensor was between 0.878-0.917 for displacement and 0.847-0.884 for velocity. Inter-observer reliability of the lumbar sensor ranged from 0.870-0.940 for displacement and 0.863-0.884 for velocity.
Conclusion There were no significant differences between the kinematic records made by the two inertial sensors placed in the lumbar and thoracic regions during SLS testing. In addition, inertial sensors have the potential to be reliable, valid and sensitive instruments for kinematic measurements during SLS testing, but further research is needed.
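The resultant magnitude in the abstract combines the three axis components into one value. A minimal sketch of that computation (the function name and sample component values are ours, not from the study):

```python
import math

def resultant(vx: float, vy: float, vz: float) -> float:
    """Resultant magnitude of the three axis components: rotation (X),
    flexion/extension (Y) and inclination (Z), as in RV = sqrt(Vx^2 + Vy^2 + Vz^2)."""
    return math.sqrt(vx ** 2 + vy ** 2 + vz ** 2)

# Illustrative component velocities from a single sample
print(resultant(3.0, 4.0, 12.0))  # 13.0
```

The same formula applies to displacement components, yielding the resultant displacement.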
Abstract:
We and others have published on the rapid manufacture of micropellet tissues, typically formed from 100-500 cells each. The micropellet geometry enhances cellular biological properties, and in many cases the micropellets can subsequently be utilized as building blocks to assemble complex macrotissues. Generally, micropellets are formed from cells alone, however when replicating matrix-rich tissues such as cartilage it would be ideal if matrix or biomaterials supplements could be incorporated directly into the micropellet during the manufacturing process. Herein we describe a method to efficiently incorporate donor cartilage matrix into tissue engineered cartilage micropellets. We lyophilized bovine cartilage matrix, and then shattered it into microscopic pieces having average dimensions < 10 μm diameter; we termed this microscopic donor matrix "cartilage dust (CD)". Using a microwell platform, we show that ~0.83 μg CD can be rapidly and efficiently incorporated into single multicellular aggregates formed from 180 bone marrow mesenchymal stem/stromal cells (MSC) each. The microwell platform enabled the rapid manufacture of thousands of replica composite micropellets, with each micropellet having a material/CD core and a cellular surface. This micropellet organization enabled the rapid bulking up of the micropellet core matrix content, and left an adhesive cellular outer surface. This morphological organization enabled the ready assembly of the composite micropellets into macroscopic tissues. Generically, this is a versatile method that enables the rapid and uniform integration of biomaterials into multicellular micropellets that can then be used as tissue building blocks. In this study, the addition of CD resulted in an approximate 8-fold volume increase in the micropellets, with the donor matrix functioning to contribute to an increase in total cartilage matrix content. 
Composite micropellets were readily assembled into macroscopic cartilage tissues; the incorporation of CD enhanced tissue size and matrix content, but did not enhance chondrogenic gene expression.
Abstract:
We are currently facing an overwhelming growth in the number of reliable information sources on the Internet. The quantity of information available to everyone via the Internet is growing dramatically each year [15]. At the same time, the temporal and cognitive resources of human users are not changing, causing a phenomenon of information overload. The World Wide Web is one of the main sources of information for decision makers (reference to my research). However, our studies show that, at least in Poland, decision makers see some important problems when turning to the Internet as a source of decision information. One of the most common obstacles raised is the distribution of relevant information among many sources, and therefore the need to visit different Web sources in order to collect all important content and analyze it. A few research groups have recently turned to the problem of information extraction from the Web [13]. Most effort so far has been directed toward collecting data from dispersed databases accessible via web pages (referred to as data extraction or information extraction from the Web) and towards understanding natural language texts by means of fact, entity, and association recognition (referred to as information extraction). Data extraction efforts show some interesting results; however, proper integration of web databases is still beyond us. The information extraction field has recently been very successful in retrieving information from natural language texts, but it still lacks the ability to understand more complex information, which requires the use of common-sense knowledge, discourse analysis and disambiguation techniques.
Abstract:
We present an empirical evaluation and comparison of two content extraction methods in HTML: absolute XPath expressions and relative XPath expressions. We argue that the relative XPath expressions, although not widely used, should be used in preference to absolute XPath expressions in extracting content from human-created Web documents. Evaluation of robustness covers four thousand queries executed on several hundred webpages. We show that in referencing parts of real world dynamic HTML documents, relative XPath expressions are on average significantly more robust than absolute XPath ones.
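The robustness difference that the evaluation measures can be illustrated with a toy case: an absolute path breaks when a page redesign inserts a wrapper element, while a relative expression anchored to a property of the target element still matches. This sketch uses Python's standard-library ElementTree; the page snippets and query strings are hypothetical, not drawn from the evaluation corpus:

```python
import xml.etree.ElementTree as ET

original = "<html><body><div id='main'><p class='price'>42</p></div></body></html>"
# The same content after a redesign adds a wrapper element around the div
redesigned = ("<html><body><section><div id='main'>"
              "<p class='price'>42</p></div></section></body></html>")

absolute = "./body/div/p"           # encodes the full ancestor chain
relative = ".//p[@class='price']"   # anchored to an attribute of the target

def hits(page: str, query: str) -> int:
    """Number of elements the query matches in the given document."""
    return len(ET.fromstring(page).findall(query))

print(hits(original, absolute), hits(original, relative))      # 1 1
print(hits(redesigned, absolute), hits(redesigned, relative))  # 0 1
```

The relative query keeps matching after the structural change, which is the robustness property the paper quantifies over real dynamic HTML documents.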
Abstract:
All media are social—they are after all media, in between, intermediating between producers and consumers of content, information, conversation, between the actors in the media and the audiences who read, listen, and watch. And the sociality of the media does not stop there: the processes of media production are social processes just as much as the activities of media audiencing. So strictly speaking, all media are social media. But only a particular subset of all media are fundamentally defined by their sociality, and thus distinguished from the mainstream media of print, radio, and television. It is the actual uses which are made of any medium which determine whether it is indeed a social medium—so let us investigate their roles in and interplay with the societies in which they operate.
Abstract:
Blasting is an integral part of large-scale open cut mining that often occurs in close proximity to population centers and often results in the emission of particulate material and gases potentially hazardous to health. Current air quality monitoring methods rely on limited numbers of fixed sampling locations to validate a complex fluid environment and collect sufficient data to confirm model effectiveness. This paper describes the development of a methodology to address the need for a more precise approach that is capable of characterizing blasting plumes in near-real time. The integration of the system required the modification and integration of an opto-electrical dust sensor, the SHARP GP2Y10, into a small fixed-wing aircraft and a multi-rotor copter, resulting in the collection of data streamed during flight. The paper also describes the calibration of the optical sensor against an industry-grade dust-monitoring device, the Dusttrak 8520, demonstrating a high correlation between them, with correlation coefficients (R²) greater than 0.9. The laboratory and field tests demonstrate the feasibility of coupling the sensor with the UAVs. However, further work must be done in the areas of sensor selection and calibration as well as flight planning.
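Calibration of this kind can be sketched as a least-squares fit between paired readings from the two instruments, with R² measuring how well the fit explains the reference readings. The paired values below are synthetic stand-ins, not data from the study:

```python
# Hypothetical paired readings: optical-sensor output vs. reference monitor
sensor = [0.10, 0.22, 0.35, 0.48, 0.61, 0.74]
reference = [12.0, 25.0, 40.0, 55.0, 70.0, 86.0]

# Least-squares fit reference ~ a * sensor + b
n = len(sensor)
sx, sy = sum(sensor), sum(reference)
sxx = sum(x * x for x in sensor)
sxy = sum(x * y for x, y in zip(sensor, reference))
a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
b = (sy - a * sx) / n

# Coefficient of determination R^2 = 1 - SS_res / SS_tot
ss_res = sum((y - (a * x + b)) ** 2 for x, y in zip(sensor, reference))
ss_tot = sum((y - sy / n) ** 2 for y in reference)
r2 = 1 - ss_res / ss_tot
print(r2 > 0.9)  # near-linear synthetic data clears the reported threshold
```

An R² above 0.9, as reported, indicates that the low-cost optical sensor tracks the reference instrument closely enough for field use after calibration.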
Abstract:
This study presents a comprehensive mathematical model for the open pit mine block sequencing problem which considers technical aspects of real-life mine operations. As the open pit block sequencing problem is NP-hard, state-of-the-art heuristic algorithms, including a constructive heuristic, local search, simulated annealing, and tabu search, are developed and coded in the MATLAB programming language. Computational experiments show that the proposed algorithms solve industrial-scale instances satisfactorily. Numerical investigation and sensitivity analysis based on real-world data are also conducted to provide insightful and quantitative recommendations for mine schedulers and planners.
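As a rough illustration of one of the metaheuristics listed above, the sketch below runs simulated annealing on a toy sequencing objective (a weighted completion cost over block "values", with none of the precedence, slope or capacity constraints of the real model; all names and numbers are ours). The study's implementations are in MATLAB; this sketch uses Python:

```python
import math
import random

random.seed(0)

# Toy stand-in: each block has a value; earlier positions carry lower weights,
# so the cost rewards sequencing high-value blocks first.
values = [7, 3, 9, 1, 5, 8, 2, 6]

def cost(seq):
    """Weighted completion cost of a block order (lower is better)."""
    return sum((i + 1) * values[b] for i, b in enumerate(seq))

def anneal(seq, t0=100.0, cooling=0.995, steps=5000):
    """Simulated annealing with pairwise-swap neighbourhood."""
    current, best = list(seq), list(seq)
    t = t0
    for _ in range(steps):
        i, j = random.sample(range(len(current)), 2)
        candidate = list(current)
        candidate[i], candidate[j] = candidate[j], candidate[i]
        delta = cost(candidate) - cost(current)
        # Accept improvements always; accept worsening moves with
        # probability exp(-delta / t), which shrinks as t cools.
        if delta < 0 or random.random() < math.exp(-delta / t):
            current = candidate
            if cost(current) < cost(best):
                best = list(current)
        t *= cooling
    return best

solution = anneal(range(len(values)))
print(cost(list(range(len(values)))), "->", cost(solution))
```

The same accept/reject skeleton carries over to the real problem once the neighbourhood moves and the cost function encode mining constraints; tabu search differs mainly in replacing the temperature-based acceptance with a memory of forbidden moves.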