14 results for music technology
in the Biblioteca Digital da Produção Intelectual da Universidade de São Paulo (BDPI/USP)
Abstract:
In this paper, we initially present an algorithm for automatic composition of melodies using chaotic dynamical systems. Afterward, we characterize chaotic music in a comprehensive way as comprising three perspectives: musical discrimination, dynamical influence on musical features, and musical perception. With respect to the first perspective, the coherence between generated chaotic melodies (continuous as well as discrete chaotic melodies) and a set of classical reference melodies is characterized by statistical descriptors and melodic measures. The significant differences among the three types of melodies are determined by discriminant analysis. Regarding the second perspective, the influence of dynamical features of chaotic attractors, e.g., Lyapunov exponent, Hurst coefficient, and correlation dimension, on melodic features is determined by canonical correlation analysis. The last perspective is related to perception of originality, complexity, and degree of melodiousness (Euler's gradus suavitatis) of chaotic and classical melodies by nonparametric statistical tests. (c) 2010 American Institute of Physics. [doi: 10.1063/1.3487516]
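As a rough illustration of the kind of generator the abstract describes, the minimal sketch below produces a discrete "chaotic melody" by iterating the logistic map and quantizing each state onto a pitch scale. This is not the authors' algorithm; the map, the parameter r, and the C-major scale are assumptions made for demonstration only.

# Minimal sketch, assuming a logistic-map generator (not the paper's algorithm).
def logistic_melody(x0=0.1234, r=3.99, length=32):
    """Iterate x_{n+1} = r * x_n * (1 - x_n) and map each state to a MIDI pitch."""
    scale = [60, 62, 64, 65, 67, 69, 71, 72]  # C-major scale, one octave (MIDI note numbers)
    x = x0
    melody = []
    for _ in range(length):
        x = r * x * (1.0 - x)                  # chaotic iteration for r close to 4
        index = int(x * len(scale))            # quantize the state in (0, 1) to a scale degree
        melody.append(scale[min(index, len(scale) - 1)])
    return melody

if __name__ == "__main__":
    print(logistic_melody())

Dynamical descriptors such as the Lyapunov exponent or the Hurst coefficient could then be estimated from the same orbit and related to melodic statistics, which is the role of the canonical correlation analysis described in the abstract.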
Abstract:
This text discusses the phonographic segment of religious music in Brazil in its two main manifestations, linked respectively to the Catholic and Protestant traditions. The text offers a brief history of both traditions, as well as a description of their main recording companies and the artists of greatest prominence. In its final part, the text presents the strategies that bring together recording companies and independent artists, and reflects on Brazil's independent musical production as a whole.
Abstract:
There are several tools in the literature that support innovation in organizations. Some of the most cited are the so-called technology roadmapping methods, also known as TRM. However, these methods are designed primarily for organizations that adopt the market pull strategy of technology-product integration; organizations that adopt the technology push integration strategy are neglected in the literature. Furthermore, with the advent of open innovation, the need to consider partnerships in the innovation process becomes apparent. Thus, this study proposes a technology roadmapping method, identified as the method for technology push (MTP), applicable to organizations that adopt the technology push integration strategy, such as SMEs and independent research centers in an open-innovation environment. The method was developed through action-research and was assessed from two analytical standpoints: externally, via a specific literature review on its theoretical contributions, and internally, through the analysis of potential users' perceptions of the feasibility of applying MTP. The results indicate both the unique character of the method and its perceived implementation feasibility. Future research is suggested in order to validate the method in different types of organizations. (C) 2011 Elsevier Ltd. All rights reserved.
Abstract:
Many authors point out that the front-end of new product development (NPD) is a critical success factor in the NPD process and that numerous companies face difficulties in carrying it out appropriately. Therefore, it is important to develop new theories and proposals that support the effective implementation of this earliest phase of NPD. This paper presents a new method to support the development of front-end activities based on integrating technology roadmapping (TRM) and project portfolio management (PPM). This new method, called the ITP Method, was implemented at a small Brazilian high-tech company in the nanotechnology industry to explore the integration proposal. The case study demonstrated that the ITP Method provides a systematic procedure for the fuzzy front-end and integrates innovation perspectives into a single roadmap, which allows for a better alignment of business efforts and communication of product innovation goals. Furthermore, the results indicated that the method may also improve quality, functional integration and strategy alignment. (C) 2010 Elsevier Inc. All rights reserved.
Abstract:
This paper presents a proposal for a Quality Management System for a generic GNSS surveying company as an alternative for management and service quality improvements. As a result of the increased demand for GNSS measurements, a large number of new or restructured companies were established to operate in that market. Considering that GNSS surveying is a new process, some changes must be made to adapt the old surveying techniques and old-fashioned management to the new reality. This requires a new management model based on a well-described sequence of procedures aiming at Total Quality Management for the company. The proposed Quality Management System was based on the requirements of the ISO 9000:2000 quality system, applied to the whole company and focusing on the productive process of GNSS surveying work.
Abstract:
Considering the increasing popularity of network-based control systems and the wide adoption of IP networks (such as the Internet), this paper studies the influence of network quality of service (QoS) parameters on quality-of-control parameters. An example of a control loop is implemented using two LonWorks networks (CEA-709.1) interconnected by an emulated IP network, in which important QoS parameters such as delay and delay jitter can be completely controlled. Mathematical definitions are provided according to the literature, and the results of the network-based control loop experiment are presented and discussed.
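To make the QoS side concrete, the sketch below computes mean delay and two commonly used delay-jitter measures from a series of measured one-way network delays. These are assumed, literature-style definitions; the exact formulas adopted in the paper may differ.

# Minimal sketch, assuming standard-deviation jitter and mean inter-packet delay variation.
import statistics

def qos_metrics(delays_ms):
    """Return mean delay, standard-deviation jitter, and mean inter-packet
    delay variation for a list of one-way delays in milliseconds."""
    mean_delay = statistics.mean(delays_ms)
    std_jitter = statistics.pstdev(delays_ms)
    ipdv = [abs(b - a) for a, b in zip(delays_ms, delays_ms[1:])]  # successive differences
    mean_ipdv = statistics.mean(ipdv) if ipdv else 0.0
    return mean_delay, std_jitter, mean_ipdv

if __name__ == "__main__":
    sample = [12.1, 15.4, 11.8, 30.2, 13.0, 12.6]  # hypothetical delay samples (ms)
    print(qos_metrics(sample))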
Abstract:
This work presents a case study on technology assessment for power quality improvement devices. A system compatibility test protocol for power quality mitigation devices was developed in order to evaluate the functionality of three-phase voltage restoration devices. In order to validate this test protocol, the micro-DVR, a reduced-power development platform for DVR (dynamic voltage restorer) devices, was tested, and the results are discussed on the basis of voltage disturbance standards. (C) 2011 Elsevier B.V. All rights reserved.
Abstract:
A green ceramic tape micro-heat exchanger was developed using Low Temperature Co-fired Ceramic (LTCC) technology. The device was designed using Computer-Aided Design software, and simulations were run with a Computational Fluid Dynamics package (COMSOL Multiphysics) to evaluate the homogeneity of fluid distribution in the microchannels. Four geometries were proposed and simulated in two and three dimensions to show that geometric details directly affect the velocity distribution in the micro-heat exchanger channels. The simulation results were quite useful for the design of the microfluidic device. The micro-heat exchanger was then constructed using LTCC technology and is composed of five thermal exchange plates in a cross-flow arrangement and two connecting plates, with all plates stacked to form a device with external dimensions of 26 x 26 x 6 mm³.
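A back-of-the-envelope way to see why geometric details affect flow distribution is a hydraulic-resistance model of parallel channels. The sketch below assumes fully developed laminar flow, a wide rectangular channel approximation (R = 12*mu*L / (w*h^3)) and a manifold with negligible pressure drop; all dimensions are hypothetical, and a real distribution, as in the paper, requires CFD of the actual manifold geometry.

# Minimal sketch, assuming laminar flow and equal pressure drop across parallel channels.
def hydraulic_resistance(length_m, width_m, height_m, mu_pa_s=1.0e-3):
    """Approximate laminar resistance of a wide rectangular channel (w >> h)."""
    return 12.0 * mu_pa_s * length_m / (width_m * height_m ** 3)

def flow_split(total_flow_m3_s, channels):
    """Distribute the total flow among parallel channels in proportion to 1/R."""
    conductances = [1.0 / hydraulic_resistance(*c) for c in channels]
    total = sum(conductances)
    return [total_flow_m3_s * g / total for g in conductances]

if __name__ == "__main__":
    # Five channels, one slightly shallower than the rest (hypothetical dimensions in meters).
    channels = [(0.02, 0.001, 0.0005)] * 4 + [(0.02, 0.001, 0.0004)]
    print(flow_split(1.0e-7, channels))

Even a small reduction in channel height starves that channel of flow, which is the kind of sensitivity the CFD simulations quantify for the full geometry.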
Abstract:
The application of airborne laser scanning (ALS) technologies in forest inventories has shown great potential to improve the efficiency of forest planning activities. Precise estimates, fast assessment and relatively low complexity explain the good results in terms of efficiency. The evolution of GPS and inertial measurement technologies, as well as the lower assessment costs observed when these technologies are applied to large-scale studies, explain the increasing dissemination of ALS technologies. The good quality of the results can be expressed by estimates of volume and basal area with estimated errors below 8.4%, depending on the size of the sampled area, the number of laser pulses per square meter and the number of control plots. This paper analyzes the potential of an ALS assessment to produce certain forest inventory statistics in plantations of cloned Eucalyptus spp with precision equal or superior to conventional methods. The statistics of interest were: volume, basal area, mean height and mean height of dominant trees. The ALS flight for data assessment covered two strips of approximately 2 by 20 km, in which clouds of points were sampled in circular plots with a radius of 13 m. Plots were sampled in different parts of the strips to cover different stand ages. From the clouds of points generated by the ALS assessment, the following statistics were calculated: overall mean height, standard error, five percentiles (the heights below which 10%, 30%, 50%, 70% and 90% of the ALS points above ground level are found), and the density of points above ground level in each percentile. The ALS statistics were used in regression models to estimate mean diameter, mean height, mean height of dominant trees, basal area and volume. Conventional forest inventory sample plots provided the reference data. For volume, an exploratory assessment involving different combinations of ALS statistics allowed the definition of the most promising relationships and fitting tests based on well-known forest biometric models. The models based on ALS statistics that produced the best results involved: the 30% percentile to estimate mean diameter (R²=0.88 and MQE%=0.0004); the 10% and 90% percentiles to estimate mean height (R²=0.94 and MQE%=0.0003); the 90% percentile to estimate dominant height (R²=0.96 and MQE%=0.0003); the 10% percentile and the mean height of the ALS points to estimate basal area (R²=0.92 and MQE%=0.0016); and age and the 30% and 90% percentiles to estimate volume (R²=0.95 and MQE%=0.002). Among the tested forest biometric models, the best fits were provided by the modified Schumacher model using age and the 90% percentile, the modified Clutter model using age, the mean height of the ALS points and the 70% percentile, and the modified Buckman model using age, the mean height of the ALS points and the 10% percentile.
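The sketch below illustrates the general workflow the abstract describes: extracting height percentiles from the ALS returns of one plot and regressing volume on age and a percentile with a Schumacher-type log-linear form. The data are synthetic and the model form is a generic assumption, not the paper's fitted "modified Schumacher" equation or coefficients.

# Minimal sketch, assuming synthetic data and a generic Schumacher-type model.
import numpy as np

def plot_percentiles(heights_above_ground, probs=(10, 30, 50, 70, 90)):
    """Height percentiles of the ALS returns above ground level for one plot."""
    h = np.asarray(heights_above_ground, dtype=float)
    return {p: np.percentile(h, p) for p in probs}

def fit_schumacher_volume(age_years, p90, volume_m3_ha):
    """Fit ln(V) = b0 + b1/age + b2*ln(p90) by ordinary least squares."""
    X = np.column_stack([np.ones_like(age_years), 1.0 / age_years, np.log(p90)])
    y = np.log(volume_m3_ha)
    coeffs, *_ = np.linalg.lstsq(X, y, rcond=None)
    return coeffs  # b0, b1, b2

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    age = rng.uniform(2, 7, 40)                              # hypothetical stand ages (years)
    p90 = np.clip(3.0 * age + rng.normal(0, 1, 40), 1, None) # hypothetical 90% percentiles (m)
    vol = np.exp(0.5 - 1.2 / age + 1.1 * np.log(p90)) + rng.normal(0, 5, 40)
    print(fit_schumacher_volume(age, p90, np.clip(vol, 1, None)))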
Abstract:
Managing a variable demand scenario is particularly challenging for services organizations because services companies usually carry a large share of fixed costs. The article studies how a services organization manages its demand variability and how this relates to the organization's profitability. Moreover, the study searched for alternatives used to reduce the impact of demand variability on the company's profitability. The research was based on a case study of a Brazilian services provider in the information technology business. The study suggests that alternatives such as using outsourced employees to cover demand peaks may bring benefits only in the short term while reducing the company's profitability in the long term. Some options are revealed, such as the internationalization of employees and investment in developing the company's own workforce.
Abstract:
The present study used a temporal bisection task to investigate whether music affects time estimation differently from a matched auditory neutral stimulus, and whether the emotional valence of the musical stimuli (i.e., sad vs. happy music) modulates this effect. The results showed that, compared to sine wave control music, music presented in a major (happy) or a minor (sad) key shifted the bisection function toward the right, thus increasing the bisection point value (point of subjective equality). This indicates that the duration of a melody is judged shorter than that of a non-melodic control stimulus, thus confirming that "time flies" when we listen to music. Nevertheless, sensitivity to time was similar for all the auditory stimuli. Furthermore, the temporal bisection functions did not differ as a function of musical mode. (C) 2010 Elsevier B.V. All rights reserved.
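For readers unfamiliar with the bisection point, the sketch below shows one common way to estimate it: fitting a logistic psychometric function to the proportion of "long" responses at each comparison duration and reading off the duration judged "long" 50% of the time. The data and the logistic form are assumptions for illustration, not the study's analysis.

# Minimal sketch, assuming synthetic data and a logistic psychometric function (uses SciPy).
import numpy as np
from scipy.optimize import curve_fit

def logistic(duration, pse, slope):
    """Probability of responding 'long' as a function of stimulus duration."""
    return 1.0 / (1.0 + np.exp(-(duration - pse) / slope))

def estimate_pse(durations_ms, p_long):
    """Fit the logistic and return the bisection point (PSE) and slope."""
    (pse, slope), _ = curve_fit(logistic, durations_ms, p_long,
                                p0=[np.median(durations_ms), 50.0])
    return pse, slope

if __name__ == "__main__":
    durations = np.array([400, 500, 600, 700, 800, 900, 1000], dtype=float)
    p_long = np.array([0.05, 0.10, 0.30, 0.55, 0.80, 0.95, 0.99])  # hypothetical data
    print(estimate_pse(durations, p_long))

A rightward shift of the fitted function raises the PSE: the melody has to play longer before it feels as long as the control stimulus, which is the sense in which "time flies" with music.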
Abstract:
Background: The development of products and services for health care systems is one of the most important phenomena to have occurred in the field of health care over the last 50 years. It generates significant commercial, medical and social results. Although much has been done to understand how health technologies are adopted and regulated in developed countries, little attention has been paid to the situation in low- and middle-income countries (LMICs). Here we examine the institutional environment in which decisions are made regarding the adoption of expensive medical devices into the Brazilian health care system. Methods: We used a case study strategy to address our research question. The empirical work relied on in-depth interviews (N = 16) with representatives of a wide range of actors and stakeholders that participate in the process of diffusion of CT (computerized tomography) scanners in Brazil, including manufacturers, health care organizations, medical specialty societies, health insurance companies, regulatory agencies and the Ministry of Health. Results: The adoption of CT scanners is not determined by health policy makers or third-party payers in the public and private sectors. Instead, decisions are primarily made by administrators of individual hospitals and clinics, strongly influenced by both physicians and sales representatives of the medical industry, who act as change agents. Because this process is not properly regulated by public authorities, health care organizations are free to decide whether, when and how they will adopt a particular technology. Conclusions: Our study identifies problems in how health care systems in LMICs adopt new, expensive medical technologies, and suggests that a set of innovative approaches and policy instruments is needed in order to balance the institutional and professional desire to practise modern and expensive medicine in a context of health inequalities and basic health needs.
Abstract:
The objectives of this study were to assess the influence of music and voice messages on the vital signs and facial expressions of patients with disorders of consciousness, and to relate the presence of patient responses to the Glasgow Coma Scale or the Ramsay Sedation Scale. The method was a single-blinded randomized controlled clinical trial with 30 patients from two intensive care units, divided into two groups (control and experimental). The patients' relatives recorded a voice message and chose a song according to the patient's preference. The patients underwent three sessions on three consecutive days. Statistically significant alterations of the vital signs were noted during the message playback (oxygen saturation on Day 1 and Day 3; respiratory frequency on Day 3) and in facial expression on Day 1, during both the music and the message. The conclusion was that the voice message was a stronger stimulus than the music.
Abstract:
Audiometry is the main way in which hearing is evaluated, because it is a universal and standardized test. Speech tests are difficult to standardize due to the variables involved; nevertheless, their performance in the presence of competing noise is of great importance. Aim: To characterize speech intelligibility in silence and in competing noise in individuals exposed to electronically amplified music. Material and Method: The study was performed with 20 university students who presented normal hearing thresholds. The speech recognition rate (SRR) was measured after fourteen hours of sound rest, after exposure to electronically amplified music, and once again after sound rest, and was studied in three stages: without competing noise, and in the presence of Babble-type competing noise in monotic listening, at signal/noise ratios of +5 dB and -5 dB. Results: The SRR was more impaired after exposure to the music and in the presence of competing noise, and as the signal/noise ratio decreased, the individuals' performance on the test also decreased. Conclusion: The inclusion of competing noise in speech tests in the audiological routine is important, because it represents the real disadvantage experienced by individuals in daily listening.