
Article 2

Beginning, Current and Future of Food Engineering
Biology Article
Published: 23 March 2015

Food Engineering is a field that uniquely integrates several core disciplines: Electrical, Mechanical, Civil, Chemical, Industrial and Agricultural Engineering. Each of these fields plays a critical role in executing the various operations and processes of the food industry. Definitions of Food Engineering have evolved since 1952, and most of them tightly bind science and engineering principles to the handling and control of food materials. The scope of this interface has therefore continued to grow in education, research and industry.

The origins of Food Engineering date to the early 1800s and consisted mainly of quantification and modeling for food preservation. Subsequently, mathematical, heat and mass transfer models for food processing were developed to extend shelf life and reduce spoilage. Three areas consequently emerged as key research fields: thermal processing, food cooling and freezing, and food dehydration. Leading figures in Food Engineering have contributed to the field since the start of the 20th century, and the books they authored and published have greatly strengthened education in the discipline. Conferences, journals and food engineering societies formed since then have further contributed to the growth and effectiveness of the field. Several changes have been envisioned in the industry: first, mergers and acquisitions of companies in food and related industries; second, implementation of new and innovative processes; third, shifting expectations toward healthier food practices; and fourth, advances in computer technology and its role in food engineering. These changes have shaped the environment in which food engineers work and will be incorporated into future educational programs for food engineers.

A key research focus in food engineering is the integration of kinetics with transport phenomena to accomplish process design. The research model concept spans three areas: kinetic parameters, transport phenomena, and process design. Research on kinetic models has increased the availability of kinetic parameters, while work on transport phenomena has improved the understanding of heat and mass transfer in foods. In general, the concept is well suited to optimizing product quality and process parameters, and the model can be summarized as follows:
Kinetic model:

dA/dt = -kA

where k is the first-order rate constant and A is the concentration or intensity of the quality component. Rate constants were initially interpreted for food preservation processes, but here they are evaluated and expressed as functions of temperature, water activity, pressure, pH, etc.
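As a minimal numerical sketch of this kinetic model (the rate constant, activation energy and initial concentration below are illustrative assumptions, not values from the article), the first-order decay can be combined with an Arrhenius temperature dependence:

    import numpy as np

    def arrhenius_k(k_ref, Ea, T, T_ref=298.15, R=8.314):
        """Rate constant as a function of temperature (Arrhenius form)."""
        return k_ref * np.exp(-Ea / R * (1.0 / T - 1.0 / T_ref))

    def first_order_decay(A0, k, t):
        """Analytical solution of dA/dt = -k*A."""
        return A0 * np.exp(-k * t)

    # Hypothetical quality attribute held at 80 degrees C
    k = arrhenius_k(k_ref=0.01, Ea=80e3, T=353.15)  # per minute
    t = np.linspace(0, 60, 7)                       # minutes
    print(first_order_decay(100.0, k, t).round(1))  # % retained over time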
Similarly, the transport phenomenon model:

∂N/∂t = α∇²N

where N is the intensity of a process parameter and α is the appropriate transport property of the food structure. The expression is used to predict preservation process parameters such as temperature, moisture content and pressure, as well as changes in microbial populations.
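As an illustration of this transport model, a one-dimensional explicit finite-difference solution of the diffusion-type equation can be sketched as follows (grid spacing, time step, diffusivity and boundary temperatures are assumed values for the sketch):

    import numpy as np

    def diffuse_1d(N0, alpha, dx, dt, steps):
        """Explicit finite differences for dN/dt = alpha * d2N/dx2."""
        r = alpha * dt / dx**2
        assert r <= 0.5, "explicit scheme unstable for r > 0.5"
        N = N0.copy()
        for _ in range(steps):
            N[1:-1] += r * (N[2:] - 2 * N[1:-1] + N[:-2])
        return N

    # Hypothetical 2 cm food slab cooling with surfaces held at 4 degrees C
    N = np.full(21, 80.0)   # initial temperature profile, degrees C
    N[0] = N[-1] = 4.0      # boundary conditions
    print(diffuse_1d(N, alpha=1.4e-7, dx=0.001, dt=1.0, steps=600).round(1))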
Likewise, the process design model:

A = ∫ f(N) dt

where A is the intensity or concentration of the quality attribute, integrated over the process time. The focus of this design model is to assure maximum retention of product quality attributes while ensuring the reduction in microbial population required for safety and shelf-life extension.
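The three models combine naturally in process design: the kinetic model supplies k(T), the transport model supplies the time-temperature history, and the design model integrates quality loss over the process. A sketch with an invented heating-and-cooling profile (all numbers are illustrative):

    import numpy as np

    def retention(time, T, k_ref, Ea, T_ref=394.15, R=8.314):
        """A/A0 = exp(-integral of k(T(t)) dt), trapezoid rule."""
        k = k_ref * np.exp(-Ea / R * (1.0 / T - 1.0 / T_ref))
        integral = np.sum((k[1:] + k[:-1]) / 2.0 * np.diff(time))
        return np.exp(-integral)

    t = np.linspace(0, 30, 31)                          # minutes
    T = 273.15 + 80 + 41 * np.exp(-((t - 15) / 6)**2)   # process curve, K
    print(f"quality retained: {retention(t, T, k_ref=0.02, Ea=100e3):.1%}")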
To produce safer foods, Knorr demonstrated the kinetics of Escherichia coli population reduction at high pressure and the corresponding retention of product quality in sausage. Besides food processing, food supply is another critical factor in the food industry. Process improvements such as aseptic processing and packaging, better throughput of processing lines and noticeably reduced losses of product quality attributes have been major contributors to an efficient and cost-effective food supply chain. A process flow for aseptic processing and packaging is shown below.
Similarly, freeze drying is another food preservation method that increases shelf life and gives better quality to the food product, and continuous evaluation efforts are being carried out to reduce the costs associated with it. Furthermore, improvements in food texture have been researched recently. The concept of "water activity" has uniquely provided a basis for product and process development: relationships between reaction rates in foods and water activity have guided the development of relationships among weight composition, food ingredients and food chemistry versus processing temperature. Most product quality attributes can now be traced on a "food stability map". Another quantitative concept is the "state diagram". These tools have guided the food industry with enhanced knowledge of process parameters to achieve the best product quality and increased shelf life. Sample diagrams for both are shown below.

Food Stability Map


State Diagram
Extrusion is another process technology with significant impact on food product development. It combines parameters such as pressure, temperature and time to produce products with unique characteristics and properties. The process has been simulated on the state diagram, and stages such as wetting, heating and mixing, expansion, drying and cooling have been formulated to follow the chemical and physical changes across the cycle. This understanding has created an almost unlimited array of opportunities for new and innovative products. Through nanoscale science, food textures can be further improved: the encapsulation of flavors within nanoscale particles ensures retention of flavor intensity through processing steps and for long periods during storage and distribution. The future of food engineering will revolve critically around nanoscale science and the translation of its outcomes into the food industry; shelf life can be extended, and new and improved packaging can be established using nanoscale materials. Food engineering will also play a vital role in establishing robust health and wellness care for consumers by providing insight into the fundamentals of food metabolism and, eventually, the product and process development cycle. There is no doubt that material and energy balances will become standard tools for evaluating all operations, from the point of raw material production to final delivery of the product. Continuous research on preservation processes such as refrigeration, drying and canning remains a focus of the field, and the transition of new information from process design to commercial manufacturing, in the form of technology transfer, will continue to be a great challenge and a central focus of food engineering.

List and briefly describe the contribution of 10 food engineers from around the world that have been relevant to the profession (500 to 1000 words). Please do not cut and paste!!!
a) Marcus Karel: He is a primary leader in establishing the effect of molecular weight on the glass transition. A 1992 paper, "Glass Transition in Poly(Vinyl Pyrrolidone): Effect of Molecular Weight and Diluents", published in Biotechnology Progress, describes this relationship, which in the present literature is represented in the state diagram.
b) Marcel Loncin: He was a chemical engineer who made significant contributions to emulsion and drying technologies in the food processing industry, including drying in its various forms and the design of separation equipment to implement these technologies. He also published work on the influence of surface-active agents on the drying rate of solids. A research prize named after Marcel Loncin is awarded by the Institute of Food Technologists.
c) Ted Labuza: He has made significant contributions to water activity and the practical aspects of moisture sorption isotherm measurement and use. He was able to determine that it is water activity, not moisture content, that is the root cause of problems such as shelf-life loss and texture changes.
d) Dennis R. Heldman: He has been a leading contributor to the field, establishing novel research on freezing processes to maximize food product quality. Relationships of the surface heat-transfer coefficient to product dimensions and characteristics have been determined through his research, and the influence of freezing rates, storage temperatures and thawing conditions on the quality of frozen foods has also been established under Dr. Heldman's leadership.
e) Q. Tuan Pham: He has contributed to food engineering by optimizing refrigeration processes in the food industry. He has also provided predictions of the calorimetric properties and freezing times of foods from composition data; these formulations account for sensible heat by calculating precooling, phase-change and subcooling times.
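In the spirit of such freezing-time predictions, here is a sketch using the classical Plank equation rather than Pham's full method (slab shape factors P = 1/2 and R = 1/8; all property values are textbook-style assumptions, not data from Pham's work):

    def plank_freezing_time(rho, latent_heat, T_freeze, T_medium,
                            thickness, h, k, P=0.5, R=0.125):
        """Plank's freezing time for an infinite slab, in seconds.

        rho in kg/m^3, latent_heat in J/kg, temperatures in degrees C,
        thickness in m, h in W/(m^2 K), k (frozen) in W/(m K).
        """
        dT = T_freeze - T_medium
        return (rho * latent_heat / dT) * (P * thickness / h
                                           + R * thickness**2 / k)

    # Hypothetical 5 cm lean-meat slab in a -30 degree C air blast
    t = plank_freezing_time(rho=1050, latent_heat=250e3, T_freeze=-1.7,
                            T_medium=-30, thickness=0.05, h=30, k=1.6)
    print(f"estimated freezing time: {t / 3600:.1f} h")  # roughly 2.7 h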
f) J. Kuprianoff: He has contributed greatly to the food industry by distinguishing the various forms of water (bound, free, etc.) inside foods, which largely determine the water activity and moisture content of complex food systems. These distinctions also bear critically on food dehydration and rehydration cycles.
g) H. A. Leniger: He was one of the pioneers in integrating chemical engineering with complex food systems to form food engineering. The study of chemical reactions initiated by the addition of chemicals to raw and auxiliary materials was researched and developed by Dr. Leniger. He also studied the complexities of Newtonian flow in pipes and ducts, and applied similar concepts to high-temperature short-time extrusion cooking; such processes are a form of thermal processing whose criticality to food quality he examined.

h) T. Lovric: He has been a leading researcher on the biochemical and processing aspects of food. For example, lycopene, the compound responsible for the color of tomatoes, was detected using spectrophotometry, and the aroma of Rhine Riesling wines was characterized to determine the most intense odor-active compounds using gas chromatography. He has also studied the rheological properties of aqueous solutions of food additives, and Dr. Lovric established studies of the phenolic compounds that contribute most to the aroma and taste of numerous food products.
i) R. P. Singh: He has been a leading researcher in integrating computer simulation with food engineering, working on numerous projects ranging from thawing and drying simulation techniques to computer analysis of quality degradation in shelf-stored foods. Such simulations also capture other characteristics, such as sensory properties and water activity, to approximate shelf life. Dr. Singh has also played a vital role in establishing robust grounds for wheat production across the globe.
j) D. B. Lund: He has been a leading researcher in establishing sterilizing temperatures for food products. Along the same lines, he developed the kinetics of thermal softening of food and the corresponding physical changes, and published work on textural degradation, characterized via activation energies, induced when vegetables are held in a thermal environment. In the same vein, he has predicted quality-factor retention in both conduction- and convection-heated foods.

***************************************************
Please list and briefly describe 5 important concepts introduced by food engineers and closely related colleagues. These concepts should be of relevance because of the changes they promoted and their impact on the food industry and the analysis of engineering problems.
Important concepts:
Molecular weight versus glass transition (state diagrams): This concept was largely developed by Marcus Karel. It describes the effect of molecular weight on the glass transition. Any given food complex has a weight fraction of solids that influences the melting and boiling points of the food; this in turn affects the glass transition temperature and reflects the water content, or water activity, within the food. All of these concepts are brought together in a single diagram called the state diagram.
Water activity and practical aspects of moisture sorption isotherm measurement and use: This concept was largely developed by Ted Labuza and J. Kuprianoff. It states that when water present in foods enters the glassy state, its movement is so slow that it is effectively bound. Water potential is a quantitative measure of the binding energy of water in food. Moisture sorption isotherms have been developed that explain the effects of the surrounding temperature on the moisture content and water activity of a product, and that illustrate and compare water activity against moisture content.
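A small sketch of such an isotherm using the widely applied GAB (Guggenheim-Anderson-de Boer) model; the parameter values are arbitrary illustrative numbers, not fitted data:

    import numpy as np

    def gab_isotherm(aw, m0, C, K):
        """GAB model: equilibrium moisture content vs. water activity.

        m0: monolayer moisture content (g water / g solids);
        C, K: energy constants of the GAB equation.
        """
        return m0 * C * K * aw / ((1 - K * aw) * (1 - K * aw + C * K * aw))

    aw = np.linspace(0.05, 0.90, 8)  # water activity range
    print(gab_isotherm(aw, m0=0.08, C=10.0, K=0.85).round(3))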

Kinetics of reactions in complex food systems: This concept was largely brought forward by D. B. Lund. It evolved from the rising importance not only of nutritional quality but also of flavor, color and texture. Understanding these behaviors requires mathematical models of complex food systems and of the reaction kinetics taking place inside the food during processing. The three main criteria required to understand reaction kinetics are:
i) stoichiometry, ii) order and rate of reaction, iii) mechanism.
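As a sketch of criterion ii), the order and rate constant can be estimated from concentration-time data: for first-order behavior, ln C versus t is a straight line with slope -k (the data below are synthetic):

    import numpy as np

    # Synthetic first-order data generated with k = 0.05 per minute
    t = np.array([0.0, 10.0, 20.0, 30.0, 40.0, 60.0])
    C = 100.0 * np.exp(-0.05 * t)

    # Linear regression on ln C vs. t recovers the rate constant
    slope, intercept = np.polyfit(t, np.log(C), 1)
    print(f"estimated k = {-slope:.3f} per min, C0 = {np.exp(intercept):.1f}")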
Momentum, heat and mass transfer in food: This concept was initially pioneered and developed by Marcel Loncin. During the drying of foods, sorption equations and the Fourier and Fick laws are primarily used to describe the transfer of heat and mass in the solid. Mathematically, the total system must balance, and transfer phenomena are the medium through which phase changes occur under forced or natural changes in environmental conditions.
Mobile refrigeration systems: The invention of mobile refrigeration around 1949 revolutionized the food industry and the methodology of engineering problems, especially with respect to mathematical modeling of product shelf life. It also removed the problem of food spoilage during transportation and became a good replacement for ice, which was causing pollution and sewage dumping. The pioneer of these efforts was Frederick McKinley Jones.

***************************************************
List important conferences around the world
dealing with Food engineering.
Food Automation & Manufacturing Conference and Expo (FAM)
It is one of the important conferences dealing with the automation of processing lines via the latest state-of-the-art robots and precision machine control. It is also a community that evaluates return on investment (ROI) on plant automation projects in the food industry. Its next event is on 3 April 2011 in Florida.
International Conference on Food Engineering and Biotechnology(ICFEB)
It is one of the food engineering conferences that bring together scientists, leading engineers, industry researchers and research students to share their experiences, research results and the challenges encountered in their work. Its next conference is in Tokyo, Japan on 25 May 2011.
International congress on Engineering and Food(ICEF)
It is one of the conferences that contribute to the solution of vital problems in a world of increasing constraints on raw materials, energy and the environment. Its next congress will be held in Athens, Greece on 22 May 2011.
International Microwave Power Institute(IMPI)
Its conferences bring together scientists, researchers and product developers interested in microwave/RF technology. Its 45th annual symposium will be held in Louisiana, USA on 8 June 2011.
Pacific Fisheries Technologists Conference (PFT)
It is a conference for broadening professional networks, discussing current seafood issues and exchanging information on current research in seafood technology. Its latest conference was held in Vancouver, British Columbia, Canada on 13 February 2011.
International Society of Food Engineering (ISFE)
It is a forum that facilitates communication among professionals working in food engineering in academia, industry and government around the world. It conducts various food engineering conferences, including the Ibero-American food engineering conferences.

Please list and briefly describe 10 journals dealing with food engineering topics.
Journal of Food Science: Established in 1936, it is published by John Wiley and Sons on behalf of the Institute of Food Technologists in Chicago, Illinois. The journal publishes papers largely covering food chemistry, food engineering, food microbiology, nutrition and sensory analysis.
The International Journal of Food Science & Technology: It promotes new techniques and knowledge in the food industry, serving industrial research by providing high-quality refereed original papers in food science and technology.
Journal of Agricultural and Food Chemistry: It publishes papers in the agricultural and food chemistry arena. These papers give a greater understanding of the role chemistry plays in the global economy, health, safety and the environment, and authors are given the unique opportunity to exchange ideas with leading experts in the field.
Journal of Food Engineering: This journal publishes papers in areas at the interface between food and engineering, particularly those relevant to industry: engineering properties of food; measurement, control, packaging, storage and distribution; and engineering aspects of the design and operation of food processes. The current editor-in-chief is Dr. R. Paul Singh.
Advance Journal of Food Science and Technology: It publishes experimental and theoretical research findings that have the potential to help the food industry improve process efficiency, enhance product quality and extend the shelf life of fresh and processed food products.

Journal of Bioscience and Bioengineering: It is devoted to the advancement and dissemination of knowledge concerning fermentation technology, biochemical engineering, food technology and microbiology. The editor-in-chief is Hisao Ohtake.
International Journal of Refrigeration: This journal has published special issues on alternative refrigerants and on novel topics in the field of boiling, condensation, heat pumps, ice-slurry systems, food refrigeration, carbon dioxide and magnetic refrigeration at room temperature. The editor-in-chief is H. Auracher.
Journal of Food Process Engineering: It publishes the best original research on applications of engineering principles and concepts to food and food processes. The processes include any physical properties and changes to the food product that result in preservation of the food, extending to transportation, product shelf life, or improvements in product quality attributes.
Food and Bioproducts Processing: It is the only journal that exploits the synergy between biotechnology, bioprocessing and food engineering. Developments in plants or processes that can be given quantitative expression are encouraged. The journal is especially interested in papers that extend the boundaries of food and bioproducts processing.
Biosystems Engineering: It publishes research in engineering and the physical sciences that represents advances in understanding or modeling the performance of biological systems for sustainable development in land use and the environment, agriculture and amenity, bioproduction processes and the food chain. Papers generally report the results of experiments, theoretical analyses, the design of or innovations relating to machines and mechanization systems, processes or processing methods, equipment and buildings, experimental equipment, and laboratory and analytical techniques and instrumentation.
International Journal of Agricultural and Biological Engineering: It is a journal that invites novel sustainable technologies in the energy, agriculture and food systems area. It is published jointly by the US-based Association of Overseas Chinese Agricultural, Biological and Food Engineers (AOCABFE) and the Chinese Society of Agricultural Engineering (CSAE). The journal provides a home for the latest high-quality research concerning agricultural, food and biological engineering and the application of bio-agricultural engineering techniques in all areas of agriculture, featuring work of significance, originality and relevance in all the concerned areas.

***************************************************
Please list and briefly describe 15 books
dealing with food engineering topics.

Innovation in Food Engineering: New Techniques and Products: This book covers the application of hybrid microwave technology for drying and extraction; aseptic packaging of food (basic principles) and new developments concerning decontamination methods for packaging materials; the latest developments and future trends in food packaging and biopackaging; and uses of whole cereals and cereal components for the development of functional foods. The book is authored by Maria Laura Passos and Claudio P. Ribeiro.
Trends in Food Engineering: The book presents structure-property relationships, food rheology, and the correlations between physicochemical and sensory data. It also presents developments in food processing such as minimal preservation and thermal and non-thermal processing. Current topics such as biotechnology, food additives, and the functional properties of proteins are discussed in three separate sections. Specific topics include plastic materials for modified-atmosphere packaging, physical and microstructural properties of frozen gelatinized starch suspensions, and vacuum impregnation in fruit processing. The book is authored by Jorge E. Lozano.
Encyclopedia of Food Engineering: This book is authored by Carl W. Hall, A. W. Farrall and A. L. Rippen. It covers everything from aerobic reactions to convective heat transfer in foods, soil dynamics, water vapor properties and much more. It is an exhaustive, clearly written and meticulously detailed one-stop reference for the key terms of food engineering.
Introduction to Food Engineering: It is authored by R. Paul Singh. It presents the concepts and unit operations used in food processing, covering important topics such as supplemental processes (filtration, sedimentation, centrifugation, mixing), extrusion processes for foods, packaging concepts and the shelf life of foods. Emerging technologies such as high pressure and pulsed electric fields, the design of plate heat exchangers, the impact of fouling on heat transfer processes, and the use of dimensional analysis in problem solving also make up a major portion of the book.
Advances in Food Engineering: It contains the most recent developments in food engineering, including water relations of foods and their role in dehydration processes; measurement of food properties and their role in optimizing food processing operations; advances in heat and mass transfer and their application to improving food processing operations; innovations in equipment design and plant operations; structural and rheological considerations in food processing; reaction kinetics in food processing, packaging and storage; and food engineering research and development in Indonesia. It is authored by R. P. Singh and M. A. Wirakartakusumah.
Food Engineering Research Developments: It includes, but is not limited to, the application of agricultural engineering and chemical engineering principles to food materials. It also covers the design of machinery and processes to produce foods; the design and implementation of food safety and preservation measures in food production; biotechnological processes of food production; the choice and design of food packaging materials; and quality control of food production. It is authored by Terrance P. Klening.

Food Engineering 2000: It is authored by Pedro Fito Maupoei, Enrique Ortega-Rodriguez and Gustavo V. Barbosa-Canovas. It covers topics including microstructural and imaging analyses, rheological properties of fluid foods, multilayer sorption isotherms, and structural changes in the minimal processing of fruits, as well as the study of critical variables in the treatment of foods by pulsed electric fields.
Solving Problems in Food Engineering: It is authored by S. Yanniotis. This book covers mathematical problems in fluid flow, heat transfer, mass transfer and the most common unit operations with applications in food processing, such as thermal processing, cooling and freezing, evaporation, psychrometrics and drying. It is an excellent compilation of conceptual and quantitative food engineering problems.
Unit Operations in Food Engineering: It is authored by Albert Ibarz and Gustavo V. Barbosa-Canovas. It covers molecular transport of momentum, energy and mass; rheology of food products; transport of fluids through pipes; circulation of fluids through porous beds; filtration; membrane separation processes; thermal properties of food; heat transfer by conduction, convection and radiation; thermal processing of foods; food preservation by cooling; dehydration; distillation; absorption; solid-liquid extraction; and adsorption and ion exchange.
New Food Engineering Research Trends: It is authored by Alan P. Urwaye. It covers new research in the growing field of food engineering. Topics include ionizing irradiation of foods; fruit and vegetable dehydration in tray dryers; ultrasound in fruit processing; protein hydrolysis with enzyme recycle by membrane ultrafiltration; far-infrared heating in the paddy drying process; and a novel two-stage dynamic packaging for respiring produce.
Food Engineering Principles and Selected Applications: It covers various techniques employed in the food industry, including equations for the transfer of mass, heat and momentum; equilibrium between phases; various mechanical operations; applied biochemical kinetics; and cleaning, disinfection and rinsing. It is authored by Marcel Loncin and R. L. Merson.
Introduction to Food Process Engineering: A basic introduction to the principles of food process engineering: physical properties of food, traditional fluid mechanics, and heat transfer by conduction, convection and radiation. It also covers transport phenomena and momentum, energy and mass balances, as well as a new topic on macroscopic balances. It is authored by Albert Ibarz and Gustavo V. Barbosa-Canovas.
Elements of Food Engineering: It is authored by Ernest L. Watson and John C. Harper. It deals mainly with the production, preparation, processing, handling, packaging and distribution of food. The subject matter is treated from a unit-operations point of view: materials handling, cleaning, separating, disintegrating, pumping, mixing, heat exchanging and packaging.

Food Engineering Laboratory Manual: This manual presents an approach to laboratory experiments; topics covered include safety, preparing for a laboratory exercise, effectively performing an experiment, properly documenting data, and preparing laboratory reports. The chapters cover unit operations centered on food applications: dehydration, thermal processing, friction losses in pipes, freezing, extrusion, evaporation and physical separations. It is authored by Gustavo V. Barbosa-Canovas, Li Ma and Blas Barletta.
Automation for Food Engineering: Food Quality Quantization and Process Control: It is authored by Yanbo Huang, A. Dale Whittaker and Ronald E. Lacey. It covers concepts, methods and theories of data acquisition and analysis, modeling, classification and prediction, and control as they pertain to food quality quantization and process control. It also covers the application of advanced methods such as wavelet analysis and artificial neural networks to automated food quality evaluation, and introduces novel system prototypes such as machine vision, elastography and the electronic nose.

Article 3
Introduction to Wireless Power Transfer
October 28, 2016 by Marie Christiano

Wireless Power Transfer holds the promise of freeing us from the tyranny of power cords. This technology is being incorporated into all kinds of devices and systems. Let's take a look!

The Wired Way


The majority of today's residences and commercial buildings are powered by
alternating current (AC) from the power grid. Electrical stations generate AC
electricity that is delivered to homes and businesses via high-voltage transmission
lines and step-down transformers.
Electricity enters at the breaker box, and then electrical wiring delivers current to the AC equipment and devices that we use every day: lights, kitchen appliances, chargers, and so forth.
All components are standardized and in agreement with the electrical code. Any
device rated for standard current and voltage will work in any of the millions of
outlets throughout the country. While standards differ between countries and
continents, within a given electrical system, any appropriately rated device will
work.
Here a cord, there a cord. . . . Most of our electrical devices have AC power cords.

Wireless Power Technology

Wireless Power Transfer (WPT) makes it possible to supply power through an air
gap, without the need for current-carrying wires. WPT can provide power from an
AC source to compatible batteries or devices without physical connectors or wires.
WPT can recharge mobile phones and tablets, drones, cars, even transportation equipment. It may even be possible to wirelessly transmit power gathered by solar-panel arrays in space.
WPT has been an exciting development in consumer electronics, replacing wired
chargers. The 2017 Consumer Electronics Show will have many devices offering
WPT.
The concept of transferring power without wires, however, has been around since
the late 1890s. Nikola Tesla was able to light electric bulbs wirelessly at his
Colorado Springs Lab using electrodynamic induction (aka resonant inductive
coupling).

An image from Tesla's patent for an "apparatus for transmitting electrical energy,"
1907.

Three light bulbs placed 60 feet (18m) from the power source were lit, and the
demonstration was documented. Tesla had big plans and hoped that his Long
Island-based Wardenclyffe Tower would transmit electrical energy wirelessly
across the Atlantic Ocean. That never happened owing to various difficulties,
including funding and timing.
WPT uses fields created by charged particles to carry energy between transmitters
and receivers over an air gap. The air gap is bridged by converting the energy into
a form that can travel through the air. The energy is converted to an oscillating
field, transmitted over the air, and then converted into usable electrical current by a
receiver. Depending on the power and distance, energy can be effectively
transferred via an electric field, a magnetic field, or electromagnetic (EM) waves
such as radio waves, microwaves, or even light.
The following table lists the various WPT technologies as well as the form of power transfer.

Technology                  | Energy Transfer            | Enabling the Power Transfer
Inductive coupling          | Magnetic fields            | Coils of wire
Resonant inductive coupling | Magnetic fields            | Resonant circuits
Capacitive coupling         | Electric fields            | Conductive coupling plates
Magnetodynamic coupling     | Magnetic fields            | Rotating permanent magnets
Microwave radiation         | Microwaves                 | Phased arrays/dishes
Optical radiation           | Light/infrared/ultraviolet | Lasers/photocells

WPT technologies.
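For the resonant inductive coupling entry, the transmitter and receiver coils are tuned to the same resonant frequency. A small sketch of that tuning calculation (the coil and capacitor values are arbitrary illustrative choices):

    import math

    def resonant_frequency(L, C):
        """Resonant frequency of an LC tank: f = 1 / (2*pi*sqrt(L*C))."""
        return 1.0 / (2.0 * math.pi * math.sqrt(L * C))

    # Hypothetical coil: 10 uH with a 100 nF tuning capacitor
    f = resonant_frequency(L=10e-6, C=100e-9)
    print(f"resonant frequency: {f / 1e3:.0f} kHz")  # ~159 kHz

A frequency of about 159 kHz happens to sit inside the 87-205 kHz band used by Qi charging, discussed next.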

Qi Charging, an Open Standard for Wireless Charging
While some of the companies promising WPT are still working to deliver products,
Qi (pronounced "chee") charging is standardized, and devices are currently
available. The Wireless Power Consortium (WPC), established in 2008,
developed the Qi standard for battery charging. The standard supports both
inductive and resonant charging technologies.
Inductive charging has the energy passing between a transmitter and receiver coil
at close range. Inductive systems require the coils to be in close proximity and in
alignment with each other; usually the devices are in direct contact with the
charging pad. Resonant charging does not require careful alignment, and chargers
can detect and charge a device at distances up to 45mm; thus, resonant chargers
can be embedded in furniture or mounted in shelving.

The Qi logo displayed on the Qimini wireless charging plate. Image courtesy
of Tektos.

The presence of a Qi logo means the device is registered and certified by the
Wireless Power Consortium.
When first introduced, Qi charging was low power, about 5W. The first
smartphones using Qi charging were introduced in 2011. In 2015, Qi was
expanded to include 15W, which allows for quick charging.
The following graphic from Texas Instruments shows what the Qi standard covers.

Image courtesy of Kalyan Siddabattula and Texas Instruments (PDF).

Only devices listed in the Qi Registration Database are guaranteed to provide Qi compatibility. There are currently over 700 products listed. It is important to
recognize that products with the Qi logo have been tested and certified; the
magnetic fields they use will not cause problems for sensitive devices such as
mobile phones or electronic passports. Registered devices are guaranteed to work
with all registered chargers.
For more information on Qi wireless charging, check out this article, and for an
introduction to and technical evaluation of Qi-compatible transmitter/receiver WPT
evaluation boards, click here and here.

The Physics of WPT


WPT for consumer devices is an emerging technology, but the underlying
principles and components are not new. Maxwell's Equations still rule wherever
electricity and magnetism are involved, and transmitters send energy to receivers
just as in other forms of wireless communication. WPT is different, though, in that
the primary goal is transferring the energy itself, rather than information encoded in
the energy.

WPT transmitter/receiver block diagram.

The electromagnetic fields involved in WPT can be quite strong, and human safety
has to be taken into account. Exposure to electromagnetic radiation can be a
concern, and there is also the possibility that the fields generated by WPT
transmitters could interfere with wearable or implanted medical devices.
The transmitters and receivers are embedded within WPT devices, as are the
batteries to be charged. The actual conversion circuitry will depend on the
technology used. In addition to the actual transfer of energy, the WPT system must
allow the transmitter and receiver to communicate. This ensures that a receiver

can notify the charging device when a battery is fully charged. Communication also
allows a transmitter to detect and identify a receiver, to adjust the amount of power
transmitted to the load, and to monitor conditions such as battery temperature.
The concept of near-field vs. far-field radiation is relevant to WPT. Transmission
techniques, the amount of power that can be transferred, and proximity
requirements are influenced by whether the system is utilizing near-field or far-field
radiation.
Locations for which the distance from the antenna is much less than one
wavelength are in the near field. The energy in the near field is nonradiative, and
the oscillating magnetic and electric fields are independent of each other.
Capacitive (electric) and inductive (magnetic) coupling can be used to transfer
power to a receiver located in the transmitter's near field.
Locations for which the distance from the antenna is greater than approximately
two wavelengths are in the far field. (A transition region exists between the near
field and far field.) Energy in the far field is in the form of typical electromagnetic
radiation. Far-field power transfer is also referred to as power beaming. Examples
of far-field transfer are systems that use high-power lasers or microwave radiation
to transfer energy over long distances.
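A quick sketch of how these regions scale with frequency, using one wavelength as the near-field limit and two wavelengths as the start of the far field, per the rule of thumb above (the example frequencies are arbitrary):

    C = 299_792_458  # speed of light, m/s

    def field_regions(frequency_hz):
        """Approximate near-field limit and far-field start, in meters."""
        wavelength = C / frequency_hz
        return wavelength, 2.0 * wavelength

    for f, label in [(140e3, "Qi-style inductive, 140 kHz"),
                     (2.45e9, "microwave beaming, 2.45 GHz")]:
        near, far = field_regions(f)
        print(f"{label}: near field < {near:,.2f} m, far field > {far:,.2f} m")

At 140 kHz the near field extends over two kilometers, so a charging pad a few millimeters from a phone is deep in the near field; at 2.45 GHz the far field begins about 24 cm out, which is why microwave systems can beam power over long distances.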

Where WPT Works


All WPT technologies are currently under active research, much of it focused
on maximizing power transfer efficiency (PDF) and investigating techniques
for magnetic resonant coupling (PDF). In addition to the idea of walking into a
room equipped for WPT and having your devices charge automatically, much more
ambitious projects are in place.
Across the globe, electric buses are becoming the norm; London's iconic double-decker buses are planning for wireless charging, as are bus systems in South Korea, Utah, and Germany.
Using WiTricity, invented by MIT scientists, electric cars can be charged wirelessly,
and those cars can wirelessly charge your mobiles! (Using Qi charging, of course!)
This wireless technology is convenient, to be sure, but it may also charge
cars faster than plug-in charging can.

Graphic of a wireless parking charge setup built into a parking space. Image
courtesy of Toyota.

An experimental system for wirelessly powering drones has already been demonstrated. And as mentioned above, ongoing research and development is focused on the prospect of supplying some of Earth's energy needs using WPT in conjunction with space-based solar panels.
WPT works everywhere!

Conclusion
While Tesla's dream of having power delivered wirelessly for everyone's use is still
far from feasible, many devices and systems are using some form of wireless
power transfer right now. From toothbrushes to mobile phones, from cars to public
transportation, there are many applications for wireless power transfer.

Article 4

An Overview of Quality by Design
Engineering Article

Published: 23 March 2015

Given immense global competition and the growing impact of information technology, the pharmaceutical industry needs to improve its performance. The industry should implement newer technologies that can effectively reduce the cost of production while improving product quality and regulatory compliance. Quality by Design is a newer approach offered by the United States Food and Drug Administration (USFDA) which, if understood well and implemented properly, can save a considerable amount of time and cost while improving final product quality and regulatory compliance, increasing the speed at which a product reaches the market. This article discusses the background of the Quality by Design concept, the building blocks of Quality by Design, its approach across the product life span, and the benefits it offers.

Introduction:
In 2002, the U.S. Food and Drug Administration (FDA) published a guidance document for pharmaceutical companies on cGMP for the 21st century. This document expressed a strong desire that companies build quality, safety and efficacy into their products. This concept is now known as Quality by Design (QbD).

Until now, the meaning, benefits and impact of Quality by Design have been confusing to many people. Some say it is a newer way to develop drugs, biologics and devices; some say it can shorten the production cycle; some say it provides more business flexibility; but few know exactly what it is. Some people do not even know where, when and how it should be applied. Initially, many companies tried to adopt the Quality by Design concept, but confusion gave way to frustration.

Background of Quality by Design:


Quality by Design (QbD) is a concept first developed by the famous quality expert Joseph M. Juran in his 1992 book "Juran on Quality by Design: The New Steps for Planning Quality into Goods and Services". He believed that quality should be planned in at the very first stage of production rather than assured by final testing. The concept was first used in the automobile industry: an article published in June 2007, "Elucidation: Lessons from the Auto Industry", notes that Toyota was the first company to implement many Quality by Design concepts, improving its automobiles in the 1970s. That is why we can say that the Quality by Design concept is new only for FDA-regulated industries and not for other industries such as technology, aeronautics and telecommunications. In other words, the computers we use, the phones we answer, the airplanes we ride and the cars we drive are all products of Quality by Design, but we cannot yet say the same of every tablet we ingest or every biologic we use.
In the 1990s, many medical device manufacturers implemented Quality by Design, which reduced risk and manufacturing cost while increasing patient safety and product efficacy. From the success of QbD in medical device manufacturing, FDA officials felt that the concept should be applied to drugs and biologics as well. Internal discussion at the FDA started in the late 1990s, and the agency finally published a concept paper in 2002 on cGMP for the 21st century. With considerable help from some pharmaceutical companies, pilot programs were started to share Quality by Design applications and process understanding with other companies.

"The FDA publication defined Quality by Design as:
developing a product to meet predefined product quality, safety and efficacy; and
designing a manufacturing process to meet predefined product quality, safety and efficacy."
Yet it remains one of the most misunderstood and misused tools available to pharmaceutical and medical device companies: the FDA published only the paper and its definition, which clarified the FDA's approach but left far too many questions unanswered. Companies were left to implement the Quality by Design concept on their own. John Avellanet, a consultant from Cerulean Associates LLC, says that "when Quality by Design is planned and implemented properly, the benefits are enormous. But if Quality by Design is tackled haphazardly, the benefits fizzle."

Pharmaceutical Quality by Design:

Quality in pharmaceuticals is critically important since it directly affects patient health, and so the Food and Drug Administration (FDA) has set stringent laws for drug approval. It is the U.S. agency with the power to approve or reject drug products, biologics and medical devices in order to keep Americans free from risk. Along with the dosage forms, it is also concerned with the drug development process, e.g., how a drug is manufactured and the purity of the conditions under which it is manufactured.
To produce quality products consistently, the FDA suggests that the pharmaceutical industry implement the Quality by Design (QbD) concept. Pharmaceutical QbD is one of the FDA's two systemic, holistic and risk-based approaches to pharmaceutical development; the other is PAT (Process Analytical Technology). If QbD explains "what to do," then PAT is a framework for "how to do it". QbD is an overarching philosophy articulated in both the cGMP regulations and in robust modern quality systems. The principle of QbD is that quality should be built in by design; testing alone cannot assure product quality.
This means designing the whole drug development and manufacturing process so that it produces a product with predefined quality objectives. QbD identifies characteristics that are critical to quality from the perspective of patients, translates them into the attributes that the drug product should possess, and establishes how the critical process parameters can be varied to consistently produce a drug product with the desired characteristics [2]. To achieve this, the relationships between the formulation and manufacturing process variables and the product characteristics must be thoroughly understood, and sources of variability identified. This knowledge is then used to implement a flexible and robust manufacturing process that can adapt and produce a consistent-quality product over time [2]. Thus some of the important QbD features include:
Define the product quality profile.

Design and develop the manufacturing processes.

Identify and control the critical process parameters, critical quality attributes and sources of variability.

Control the whole manufacturing process to produce a quality product consistently over time.

Quality by testing vs Quality by Design (QbT vs QbD)


Quality by Testing is the approach that most pharmaceutical organizations currently use; some companies have replaced the QbT concept with QbD to produce a quality product consistently. In Quality by Testing, the final product is tested for assurance that a particular batch is within specification and of the highest standard quality.
In the conventional QbT approach, if the drug substance and excipients meet their specifications, the next unit operations (mixing, blending, drying, compression, coating, etc.) are carried out with fixed process parameters. If the materials do not meet the in-process specifications, the material is discarded. If it passes, an assay is performed in which purity, dissolution, disintegration and moisture content are measured; the acceptance criteria are based on data from one or more batches.
Finally, if the product fails to meet the finished-product specification, it is discarded. This is how pharmaceuticals are manufactured under the QbT approach.
In Quality by Design, consistency in product quality comes from the design and control of the process. The next step is not performed until the preceding step passes. If a step fails, the root cause of the failure is investigated, understood and fixed before moving on to subsequent steps.

Building Blocks of QbD


The first step in implementing Quality by Design is to understand the critical outputs of QbD, and then to identify its critical building blocks, such as improved process understanding and the risk associated with the process.
Critical Quality Attributes (CQAs): the critical process output measurements linked to patient needs [3].

Critical Process Parameters (CPPs): the process inputs (API, excipients), process controls and environmental factors that have major effects on the critical quality attributes (CQAs) [3].

Design Space: the combination of input variables and process parameters that provides quality assurance [3].

Failure Mode and Effects Analysis (FMEA): examines raw material variables, identifies how a process can fail, and identifies the areas of the process that remain at greatest risk of failing [3]; a small sketch of the usual FMEA risk calculation follows this list.

Process model: a quantitative picture of the process, based on fundamental and statistical relationships, that predicts the critical quality attribute (CQA) results [3].

Process capability: tracks process performance relative to the CQA specifications and provides measurements of repeatability and reproducibility regarding the CQAs [3].

Process robustness: the ability of a process to perform when faced with uncontrolled variation in process, input and environmental variables [3].

Process control: control procedures, including statistical process control, that keep the processes and measurement systems on target and within desired variation [3].

Raw material factors: the stability and capability of raw material manufacturing processes, which affect process robustness, process capability and process stability [3].

Risk level: a function of the design space, the FMEA results, and process and measurement capability, control and robustness [3].
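As referenced in the FMEA item above, a minimal sketch of the usual FMEA risk-priority-number calculation (the failure modes and their 1-10 ratings are invented for illustration):

    # FMEA risk priority number: RPN = severity x occurrence x detection,
    # each rated 1-10 (10 = worst). Hypothetical tablet-process failure modes.
    failure_modes = [
        # (description, severity, occurrence, detection)
        ("API moisture content out of range", 8, 4, 3),
        ("Blend non-uniformity",              7, 5, 6),
        ("Coating spray-rate drift",          5, 6, 4),
    ]

    for name, sev, occ, det in sorted(failure_modes,
                                      key=lambda m: m[1] * m[2] * m[3],
                                      reverse=True):
        print(f"RPN {sev * occ * det:4d}  {name}")

Ranking by RPN directs process-understanding effort to the failure modes at greatest risk.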

The outputs of process control, design space and risk are consistent with this approach. The building blocks need to be assembled as listed below before results can be realized:
Identify Critical Quality Attributes

Characterize raw material variation

Identify Critical Process Parameters

Characterize design Space

Ensure process capability, process control and process robustness

Create process model monitoring and maintenance

Perform risk analysis and management

Quality by Design across the Product Lifespan:


1. Development Stage:
The new drug development stage is the riskiest and costliest stage of a drug's or biologic's life span. If the Quality by Design concept is well understood and well applied, it provides powerful results: reduced time, cost, risk and effort. John Avellanet said that "Quality by Design is a strategic and systemic approach to get the new product pipeline to market faster, easier and for less."
Preclinical:
Quality by Design improves product development when prior knowledge from previous experience, previous products or literature surveys is used. From previous experience and customer needs, we can identify and specify the characteristics that a new product must possess.
Nonclinical:
To meet the predetermined specifications, a company must conduct preclinical as well as nonclinical experiments to verify the ability of the product being developed to meet its targets. In other words, a company should carry out in vivo and in vitro tests; depending on the product, the amount of active molecule in serum is drawn from the feasibility experiments.
Clinical:
If the Quality by Design concept is applied during drug development, the clinical studies become confirmatory. A company can use the traditional approach to clinical trials or try an adaptive trial. During the drug development process, when the product reaches the phase III trial stage, a company should focus only on micro-refinements to its product and manufacturing process.
Scale-up:
Scale-up is defined as the conversion of an industrial process from a pilot plant or small laboratory setup to large-scale commercial manufacturing. It is also a part of Quality by Design: applying QbD during scale-up allows changes, and the rationale for them, to be documented during the conversion from pilot-plant to full-scale manufacturing.
For example, imagine that Stevens Pharmaceuticals Limited manufactures chlorpromazine tablets. The pilot model was successful, and the company now wants to move to large-scale commercial manufacturing. During tablet coating, the company needs an increased nozzle size on a sprayer to meet the higher spray rate required for faster manufacturing. As long as the larger nozzle maintains the same droplet size as in pilot production, no further testing and validation would be required. Such information is then documented and attached to the final submission for market approval.
Submissions for market approval:
"Submissions based on QbD have more scientific information on product, process and controls, which allows faster reviews." According to the FDA's own internal analysis, Quality by Design-based applications are processed 63% faster than traditional submissions [3].

2. Manufacturing:
When the Quality by Design concept is applied to drug and biologic manufacturing, it offers more business flexibility. Previously, it was quite difficult for companies to modify their manufacturing processes; they had to wait for regulatory approval before implementing changes. Now, under QbD, this review can be eliminated by relying on the design space, Process Analytical Technology and "real-time" quality control.
Design Space:
The ranges of manufacturing process parameters that do not impact final product quality, safety and efficacy are called the "design space". As per the ICH guidelines, the design space is "the multidimensional combination and interaction of input variables (e.g., material attributes) and process parameters that have been demonstrated to provide assurance of quality". "Design Space consists of the set of all values and combinations of the controllable factors that are predicted to yield all of the output quality attributes within their allowable ranges with a sufficiently high level of assurance" [4].
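A toy sketch of what operating inside a design space can mean in practice: checking a proposed operating point against demonstrated acceptable ranges. A real design space is a multidimensional region derived from designed experiments; the coating step and parameter ranges here are invented for illustration:

    # Hypothetical design space for a tablet-coating step:
    # each critical process parameter maps to its acceptable range.
    design_space = {
        "spray_rate_g_min": (80.0, 120.0),
        "inlet_air_temp_c": (55.0, 70.0),
        "pan_speed_rpm":    (6.0, 12.0),
    }

    def outside_design_space(point, space):
        """Return the parameters that fall outside the design space."""
        return [name for name, value in point.items()
                if not (space[name][0] <= value <= space[name][1])]

    point = {"spray_rate_g_min": 110.0, "inlet_air_temp_c": 72.0,
             "pan_speed_rpm": 8.0}
    bad = outside_design_space(point, design_space)
    print("within design space: no regulatory review" if not bad
          else f"outside design space, review needed: {bad}")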
Knowledge Space:
Movement within the design space is not considered a change and so does not require regulatory review, but movement out of the design space is considered a change and requires regulatory review or approval. The more we know about the impact of the process on the product's final quality and safety, the more flexibility a company has under Quality by Design.
Process Analytical Technology (PAT):
It is very difficult to predict the effect of a process change on the final product. An essential part of Quality by Design accepts that even if the effect of a process change cannot be predicted, it can be fully monitored and controlled. Process Analytical Technology allows us to continuously monitor, test, analyze and adjust the whole manufacturing process, increasing control and improving efficiency through measurement of the critical process parameters (CPPs) that affect critical quality attributes (CQAs).
"REAL-TIME" Quality Control:
The third aspect of Quality by Design in the manufacturing arena is the ability to
shift quality control upstream in to production. By this way we can reduces the
waste and the cost of producing a batch that ultimately fail the quality control. By
embedding quality control checks throughout manufacturing process, Quality by
Design allows us to increase our production, improve our product and streamline
the whole process.
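One common embodiment of such embedded checks is statistical process control. A minimal sketch that flags in-process measurements outside 3-sigma control limits (the tablet-weight data are synthetic):

    import statistics

    def control_limits(baseline):
        """3-sigma Shewhart control limits from an in-control baseline run."""
        mean = statistics.mean(baseline)
        sigma = statistics.stdev(baseline)
        return mean - 3 * sigma, mean + 3 * sigma

    # Hypothetical tablet weights (mg) from a validated baseline run
    baseline = [250.1, 249.8, 250.3, 250.0, 249.7, 250.2, 249.9, 250.1]
    lo, hi = control_limits(baseline)

    for weight in [250.0, 250.4, 251.9]:  # live in-process measurements
        ok = lo <= weight <= hi
        print(f"{weight:.1f} mg: {'OK' if ok else 'OUT OF CONTROL - investigate'}")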

3. Control Strategies:
Embedding quality-control checks throughout the manufacturing process is one of the control strategies that help ensure production quality. Quality by Design is simply designing and developing the product and the manufacturing process to achieve predefined product quality, safety and efficacy; if we link the product design and development stage directly with process development, this gives us the degree of control required.
Continuous improvement:
If we recall the definition of Quality by Design as "everything we do to directly promote and prove safety, efficacy and quality of our product," then continuous improvement is part of promoting and proving the safety, efficacy and quality of our product. Through continuous improvement we can focus on making the whole manufacturing process efficient without negatively impacting the product. Since QbD facilitates continuous improvement in product quality, it increases regulatory flexibility.
Product Performance:
The key to successful implementation of Quality by Design is to identify the critical process parameters (CPPs) and critical quality attributes (CQAs) that are critical to the safety, efficacy and quality of the product. If we can prove that drug excipients such as food coloring or sweetening agents have no impact on the safety, efficacy and quality of the final drug product, then we can decide not to devote testing effort to those materials. Similarly, processes that have no impact on product safety, efficacy and quality can receive minimal attention, testing and control. This undoubtedly reduces the costs involved in product development and production.
Benefits of Quality by Design:
After full implementation of QbD, one gains assurance that all critical sources of process variability have been identified, measured and understood, so that they can be controlled by the manufacturing process itself. That is why the benefits of QbD are significant; they include:
Reduces batch failure rates

Reduces final product testing

Reduces batch release cost

Reduces operation cost from fewer failures

Increased capability to meet both customer and regulatory needs

Reduced raw material and finished product inventory costs

Faster regulatory approval of NDA and process changes

Fewer and shorter regulatory inspection of production site

Easier management of technical change

Maximize profit by increasing purchase intent


These benefits translate into significant reductions in capital requirements, resource cost and time to value. That is why QbD is said to be one of the most misunderstood and misused tools available to the pharmaceutical industry, yet if understood thoroughly and implemented properly, the benefits are enormous.

Conclusion:
Pharmaceutical Quality by Design is a systemic approach to pharmaceutical development which begins with predefined quality objectives. QbD is about using the correct tool for a specific job: it is a mind-set, not a process. QbD works for any process and does not require a 'project'. The only reason most organizations are still thinking about QbD rather than implementing it is "too many other things to do". But if understood and implemented well, it enhances and modernizes the regulation of pharmaceutical manufacturing and product quality while offering immense benefits. The results show that companies that adopt QbD can expect a significantly reduced risk of costly deviations and rejects, and QbD-based NDA submissions are reviewed by the FDA about 63% faster. Since it is the FDA's 21st-century risk-based approach, any company that understands and implements QbD correctly can build a five-star quality product and keep the FDA happy.
"If a screw is loose, tighten it; don't rebuild the whole house!"

Article 5

International Journal of Software Engineering Information Technology Essay
Published: 23, March 2015

Copying a code fragment and reusing it in several parts of a project is a common phenomenon in software project development. Statistics suggest that about 5 to 17% of a system will consist of duplicated code. A major disadvantage of code duplication, with or without minor modifications, is debugging: a bug detected in a code fragment will be repeated in all its clones, and the debugger must investigate all of these fragments for the same bug. Refactoring of repeated code is another issue in software maintenance. In this paper we survey the status of clone detection research. First, we introduce the concepts and terms involved in code cloning and the categorization of clones. Second, we discuss existing clone detection tools and compare them. Finally, the paper concludes with the open problems in clone detection research.

1. Introduction
Copying code fragments and then reusing them by pasting, with or without minor modifications or adaptations, is a common activity in software development. This type of reuse of existing code is called code cloning, and the pasted code fragment, with or without modifications, is called a clone of the original [1]. It is not easy to determine which fragment is the clone and which is the original after the development phase, and it has been found that maintaining systems with clones is more difficult [12][3].
Clones are usually the result of copy-and-paste activity. Such activity is a form of software reuse, and it reduces programming effort and time. This kind of practice is common when algorithms are similar, and it often happens in operating systems development. There is also accidental cloning, which is not the result of direct copy-and-paste activity but of using the same set of APIs to implement similar protocols [7].
1.1 Advantages of code duplication
Clones are sometimes introduced into systems deliberately to obtain maintenance benefits.
Clones frequently occur in financial software, where there are frequent updates and enhancements of the existing system to support similar kinds of new functionality. Financial products do not differ much from existing ones, especially within the same financial institution, so the developer is often asked to reuse the existing code by copying it and adapting it to the new product requirements, because developing the functionality from scratch is considered too risky [4].
To keep a software architecture clean and understandable, clones are sometimes intentionally introduced into the system [5].
Since two cloned code fragments are independent of each other both syntactically and semantically, they can evolve at different paces in isolation without affecting each other, and testing is only required for the modified fragment. Keeping cloned fragments in the system may thus speed up maintenance, especially when automated regression tests are absent [6].
Cloning (redundancy) is incorporated intentionally when designing life-critical systems: the same functionality is often developed by different teams in order to reduce the probability that the implementations fail under the same circumstances.
In real-time programs, function calls may be deemed too costly. If the compiler does not offer to inline code automatically, this has to be done by hand, and consequently there will be clones [1].
1.2 Problems with code duplication
The factors behind cloning described above are reasonable, and consequently clones do occur in large software systems. While cloning can be beneficial in practice, code clones can have severe impacts on the quality, reusability and maintainability of a software system. In the following we list some of the drawbacks of having cloned code in a system.
Increased probability of bug propagation: if a code segment contains a bug and that segment is reused by copying and pasting, with or without minor adaptations, the bug of the original segment may remain in all the pasted segments in the system; therefore, the probability of bug propagation may increase significantly [12, 8].
Increased probability of introducing a new bug: in many cases only the structure of the duplicated fragment is reused, and it is the developer's responsibility to adapt the code to the current need. This process can be error prone and may introduce new bugs into the system [9, 10].
Increased probability of bad design: cloning may also introduce bad design and a lack of good inheritance structure or abstraction. Consequently, it becomes difficult to reuse parts of the implementation in future projects, and the maintainability of the software suffers [11].

2. Categorization of clones and clone terms
Clones are generally classified into four categories. The first two may be detected through similarities in the program text that has been copied. They may be defined as follows:
Type 1 is an exact copy without modifications (except for white space and comments).
Type 2 is a syntactically identical copy; only variable, type, or function identifiers vary.
Type 3 is a copy with further modifications (e.g. a new statement can be added, or some statements can be removed).
Type 4 clones are the results of semantic similarity between two or more code fragments.
The results of code clone detection are usually reported as clone pairs or clone clusters along with their locations.
Clone Pair (CP): a pair of code portions or fragments which are identical or similar to each other.
Clone Cluster (CC): the union of all clone pairs which have code portions in common [2].
Exact clones: two or more code fragments are called exact clones if they are identical to each other apart from differences in comments, white space or layout (Type 1 clones).
Renamed clones: the term renamed clones is used when identifier names, literal values, comments or white space change in the copied fragments (Type 2 clones).
Parameterized clones: a parameterized (p-match) clone is a renamed clone with systematic renaming.
Near-miss clones: near-miss clones are those in which the copied fragments are very similar to the original. Editing activities such as changing comments or layout, repositioning source code elements through blanks and new lines, or changing identifiers, literals and macros may have been applied in such clones, which implies that all parameterized and renamed clones are near-miss clones.
Structural clones: software components can be compared with various degrees of accuracy. Structural similarity reflects the degree to which the software specifications look alike, i.e., have similar design structures.
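As a concrete illustration of the first three types, consider the following hypothetical Python fragments (our own example, not taken from the surveyed papers):

import datetime  # unused here; shown only to keep the example self-contained

# Original fragment
def total_price(items):
    total = 0
    for item in items:
        total += item.price
    return total

# Type 1 clone (e.g. pasted into another module): identical
# except for white space and comments.
def total_price(items):
    total = 0  # running sum
    for item in items:
        total += item.price
    return total

# Type 2 clone: syntactically identical, only identifiers renamed.
def sum_cost(products):
    cost = 0
    for p in products:
        cost += p.price
    return cost

# Type 3 clone: copied and then modified (a statement was added).
def total_price_with_tax(items, rate):
    total = 0
    for item in items:
        total += item.price
    total *= 1 + rate  # added statement
    return total

A Type 4 clone of total_price would be, for instance, a one-line sum(...) expression: textually and syntactically different, but semantically equivalent.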

3. Existing techniques and tools
Some detectors are based on lexical analysis. For instance, Baker's Dup [14] uses a sequence of lines as a representation of the source code and detects line-by-line clones. It uses a lexer and a line-based string matching algorithm on the tokens of the individual lines. Dup removes tabs, white space and comments; replaces identifiers of functions, variables and types with a special parameter; concatenates all lines to be analyzed into a single text; hashes each line for comparison; and extracts a set of pairs of longest matches using a suffix tree algorithm.
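The following is a minimal Python sketch of this style of line-based detection. It hashes fixed-size windows of normalized lines rather than building Baker's parameterized suffix tree, so it should be read as an approximation of the idea, not as Dup itself; the keyword set and the C-style comment pattern are illustrative assumptions.

import re
from collections import defaultdict

# Keywords that should survive parameterization (a small illustrative set).
KEYWORDS = {"if", "else", "for", "while", "return", "int", "float", "void"}

def normalize(line):
    """Drop comments and white space; replace identifiers with a placeholder."""
    line = re.sub(r"//.*", "", line)  # strip C-style line comments
    line = re.sub(r"\b[A-Za-z_]\w*\b",
                  lambda m: m.group(0) if m.group(0) in KEYWORDS else "ID",
                  line)
    return re.sub(r"\s+", "", line)

def line_clones(source, min_lines=3):
    """Group starting positions of identical windows of normalized lines."""
    lines = [normalize(l) for l in source.splitlines()]
    windows = defaultdict(list)
    for i in range(len(lines) - min_lines + 1):
        key = "\n".join(lines[i:i + min_lines])
        windows[key].append(i + 1)  # 1-based line numbers
    return [positions for positions in windows.values() if len(positions) > 1]

A real detector would additionally merge overlapping windows into maximal matches, which is precisely what the suffix tree gives Dup in one pass.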
Kodhai E. [13] proposed a combination of textual and metric analysis of source code for the detection of all types of clones in a given set of fragments of Java source code. Various metrics were formulated, and their values were used during the detection process. Combining metrics with textual analysis reduced the complexity of finding clones and gave accurate results.
Another pure text-based approach is Johnson's [12] mechanism for finding redundancy (exact repetitions of text) using fingerprints on substrings of the source code. In this algorithm, signatures calculated per line are compared in order to identify matched substrings. The Karp-Rabin fingerprinting algorithm [15] is used to calculate the fingerprints of all length-n substrings of a text. First, a text-to-text transformation is performed on the source file to discard uninteresting characters. Then the entire text is subdivided into a set of substrings such that every character of the text appears in at least one substring, after which the matching substrings are identified. In that stage, a further transformation is applied to the raw matches to obtain better results.
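A sketch of the Karp-Rabin step, assuming a fixed substring length n and an arbitrary illustrative base and modulus, might look like this:

from collections import defaultdict

def karp_rabin_fingerprints(text, n, base=256, mod=(1 << 61) - 1):
    """Rolling-hash fingerprints of every length-n substring, in O(len(text))."""
    if len(text) < n:
        return []
    h = 0
    for ch in text[:n]:               # hash of the first window
        h = (h * base + ord(ch)) % mod
    high = pow(base, n - 1, mod)      # weight of the character leaving the window
    fingerprints = [h]
    for i in range(n, len(text)):
        h = ((h - ord(text[i - n]) * high) * base + ord(text[i])) % mod
        fingerprints.append(h)
    return fingerprints

def matching_substrings(text, n):
    """Group start offsets whose length-n substrings share a fingerprint."""
    groups = defaultdict(list)
    for i, fp in enumerate(karp_rabin_fingerprints(text, n)):
        groups[fp].append(i)
    return {fp: offsets for fp, offsets in groups.items() if len(offsets) > 1}

Because two different substrings can, in principle, share a fingerprint, a practical detector would verify hash matches character by character before reporting them.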
Stephane Ducasse [16] investigated a number of simple variants of string-based clone detection that normalize differences due to common editing operations, and assessed the quality of clone detection across very different case studies. The results confirmed that this inexpensive clone detection technique generally achieves high recall and acceptable precision. Overzealous normalization of the code before comparison, however, can result in an unacceptable number of false positives.
Christopher Brown et al. [17] presented a technique for the detection and removal of duplicated Haskell code. The system was implemented within the refactoring framework of the Haskell Refactorer (HaRe) and used an Abstract Syntax Tree (AST) based approach. Detection of duplicate code was automatic, while elimination was semi-automatic, with the user managing the clone removal. After presenting the system, an example was given to show how it works in practice.

In the approach by Koschke et al. [18], the AST nodes are serialized in preorder traversal, a suffix tree is created for these serialized AST nodes, and the resulting maximally long AST node sequences are then cut according to their syntactic region so that only syntactically closed sequences remain. Instead of comparing the AST nodes directly, their approach compares the tokens of the AST nodes using a suffix-tree-based algorithm; therefore, it can find clones in linear time and space, a significant improvement over usual AST-based approaches.
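To make the serialization step concrete, here is a small sketch using Python's standard ast module; the cited work targets other languages with its own infrastructure, so this is only an analogy for the preorder-serialization idea.

import ast

def preorder_node_types(source):
    """Serialize a Python AST in preorder as a sequence of node-type names."""
    tokens = []
    def visit(node):
        tokens.append(type(node).__name__)
        for child in ast.iter_child_nodes(node):
            visit(child)
    visit(ast.parse(source))
    return tokens

# A suffix tree (or suffix array) built over sequences like this one is what
# lets the approach report maximal repeated node sequences in linear time.
print(preorder_node_types("x = 1\ny = 2"))
# e.g. ['Module', 'Assign', 'Name', 'Store', 'Constant', ...]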
Davey et al. [19] detect exact, parameterized and near-miss clones by first computing certain features of code blocks and then using neural networks to find similar blocks based on their features.
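As a rough sketch of the feature-based idea, the fragment below computes a few crude block features of our own choosing and compares them with plain cosine similarity; the cited work instead trains a neural network on its features, so the comparison function here is a stand-in, not the authors' method.

import math

def block_features(block):
    """A few crude, illustrative features of a code block."""
    lines = [l for l in block.splitlines() if l.strip()]
    return [
        len(lines),                                    # block size
        sum(l.count("(") for l in lines),              # call/parenthesis density
        sum(ch.isdigit() for l in lines for ch in l),  # numeric-literal characters
        sum(len(l) - len(l.lstrip()) for l in lines),  # total indentation (nesting)
    ]

def similarity(a, b):
    """Cosine similarity between two feature vectors (1.0 = same direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b) if norm_a and norm_b else 0.0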
Table 1. Precision and recall of clone detection approaches

Method                   | Precision                                                              | Recall
-------------------------|------------------------------------------------------------------------|--------------------------------------------------------
Line-based               | 100%; no false positives, since it checks for exact copies            | Low; finds only exact copies
Parameterized line-based | Medium; may return false positives                                     | Medium; detects only exact and parameterized clones
Token-based              | Low; normalization and/or transformation returns many false positives | High; detects most clones
Parse-tree based         | High; the parse tree also considers structural information            | Low; cannot detect all types of clones, though several approaches try to overcome this
PDG-based                | High; considers structural and semantic information                   | Medium; cannot detect all clones
Metrics-based            | Medium; two code blocks with similar metric values may not be the same| Medium; cannot detect many clones
AST + suffix tree        | High; considers structural information                                | Low; cannot detect all clones

Evolution of clone detection approaches
Here we provide a higher-level comparison of the detection approaches in terms of precision and recall, based on our study and summarized in Table 1. Text-based techniques (line-based and parameterized line-based in the table) are easily adaptable to different languages of interest, as such techniques need at most lexers for those languages.
The precision of this approach is also very good, as clones are determined by textual similarity. However, it cannot detect the many clones that are produced by common editing activities. The scalability of this approach, on the other hand, depends on the comparison algorithm and the optimizations used.
Token-based techniques are somewhat more language dependent, as they need at least a lexical analyzer for the language of interest. Although their precision is low (they return many false positives due to normalization and the application of transformation rules), the recall of these techniques is high: they can detect most clones, and using a suffix tree algorithm they are scalable enough to handle the token sequences of large systems.
Although parse-tree based techniques do not fare well on recall, they are good at detecting real clones with high precision. Because these techniques focus on the structural properties of the source code, very few false positives are returned. Similarly, PDG-based techniques focus on the structural, data-flow and control-flow properties of the source code fragments and therefore also return very few false positives. However, they are mostly language dependent (they need a PDG generator) and not scalable (graph matching is too costly). Metrics-based techniques, on the other hand, are good enough in terms of both precision and recall, with high scalability; however, these techniques also need to generate an AST or PDG from the source and are therefore heavily language dependent.
The hybrid approach of using suffix trees on ASTs is as scalable as the token-based techniques while providing high precision. However, its portability and recall remain the same as those of the usual tree-based techniques.

Conclusion
Clone detection is an active research area, and the literature contains a wealth of work on detecting and removing clones from software systems. Research has also been done on maintaining clones in software systems over their evolution life cycle. In this paper, a survey of software clone detection research has been presented, with emphasis on the types of clones, the mechanisms used and the corresponding tools. Several issues have also been pointed out for further research. Reports by various researchers are the source of this overview of clone detection research. The results of this study may serve as a roadmap for potential users of clone detection techniques, to help them select the right tool or technique for their interests. We hope it may also assist in identifying remaining open research questions, possible avenues for future research, and interesting combinations of existing techniques.

References
[...] http://pharmtech.findpharma.com/pharmtech/Special+Section:+Quality+by+Design/Building-a-Framework-for-Quality-by-Design/ArticleStandard/Article/detail/632988
[4] Watmough, Peter, & Morris, Laura. (2009, September 17). Implementation of quality by design in new product development. http://www.icheme.org/pdfs/17Sept09WatmoughandMorris.pdf
[5] Avellanet, John. (2008, March). Why quality by design? An executive's guide to the FDA's quality by design. http://www.ceruleanllc.com/Downloads_private/Cerulean_QbD_Executive_Guide.pdf
[6] Neway, Justin. (2007). Achieving manufacturing process excellence with quality by design, design space development, design of manufacturing and PAT. http://www.pharmaqbd.com/files/articles/QbD_Aegis_Pharmaceutical_Quality_by%20_Design_V3_4JN.pdf
[7] http://www.palgrave-journals.com/jgm/journal/v4/n4/full/4950073a.html
[8] http://www.ngpharma.com/article/Quality-by-Design-is-Essential-in-the-New-US-Regulatory-Environment/
[9] A change of course for pharmaceutical manufacturing. http://www.healthcarepackaging.com/archives/2009/06/a_change_of_course_for_pharmac.php
[10] Quality by Design is breathing new life into quality systems and process analytical technology. http://www.pharmamanufacturing.com/articles/2007/172.html
[11] Discoverant enables quality by design. http://www.aegiscorp.com/industries/pharmaceutical-and-biotechnology-/life-sciences-trends-/quality-by-design.html
[12] Total quality management: a promising fixation to accomplish zero defects. http://www.pharmainfo.net/reviews/total-quality-management-promising-fixation-accomplish-zero-defects
