Memory Management: Algorithms and Implementation in C/C++

Title: Memory Management: Algorithms and Implementation in C/C++

Uploaded by: 石头

Date: 2009-11-23

File size: 5.47 MB




Author: Bill Blunden
Publisher: Other
ISBN: 1556223471
File format: CHM

“Pay no attention to the man behind the curtain.”
—The Wizard of Oz
There are a multitude of academic computer science texts that discuss memory management. They typically devote a chapter or less to the subject and then move on. Rarely are concrete, machine-level details provided, and actual source code is even scarcer. When the author is done with his whirlwind tour, the reader tends to have a very limited idea about what is happening behind the curtain. This is no surprise, given that the nature of the discussion is rampantly ambiguous. Imagine trying to appreciate Beethoven by having someone read the sheet music to you, or to experience the Mona Lisa by reading a description in a guidebook.
This book is different. Very different.
In this book, I am going to pull the curtain back and let you see the little man operating the switches and pulleys. You may be excited by what you see, or you may feel sorry that you decided to look. But as Enrico Fermi would agree, knowledge is always better than ignorance.
This book provides an in-depth look at memory subsystems and offers extensive source code examples. In cases where I do not have access to source code (i.e., Windows), I offer advice on how to gather forensic evidence, which will nurture insight. While some books only give readers a peek under the hood, this book will give readers a power drill and allow them to rip out the transmission. The idea behind this is to allow readers to step into the garage and get their hands dirty.
My own experience with memory managers began back in the late 1980s when Borland’s nifty Turbo C 1.0 compiler was released. This was my first taste of the C language. I can remember using a disassembler to reverse engineer library code in an attempt to see how the malloc() and free() standard library functions operated. I don’t know how many school nights I spent staring at an 80×25 monochrome screen, deciphering hex dumps. It was tough going and not horribly rewarding (but I was curious, and I couldn’t help myself). Fortunately, I have done most of the dirty work for you. You will conveniently be able to sidestep all of the hurdles and tedious manual labor that confronted me.
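For readers who have never poked at these two calls themselves, here is a minimal sketch of how malloc() and free() are ordinarily used (this is just standard library usage for orientation, not the reverse-engineered Borland runtime code itself):

    /* Minimal illustration of the malloc()/free() pair mentioned above.
       Ordinary standard-library usage, not the library internals. */
    #include <stdio.h>
    #include <stdlib.h>

    int main(void)
    {
        /* Ask the heap manager for enough storage to hold 100 ints. */
        int *buffer = malloc(100 * sizeof *buffer);
        if (buffer == NULL) {
            fprintf(stderr, "allocation failed\n");
            return 1;
        }

        buffer[0] = 42;                     /* use the block                */
        printf("first element: %d\n", buffer[0]);

        free(buffer);                       /* return it to the heap manager */
        return 0;
    }

Everything that happens between those two calls, such as how the heap manager finds a free block, records its size, and later recycles it, is the kind of machinery this book takes apart.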
If you were like me and enjoyed taking your toys apart when you were a child to see how they worked, then this is the book for you. So lay your computer on a tarpaulin, break out your compilers, and grab an oil rag. We’re going to take apart memory management subsystems and put them back together. Let the dust fly where it may!
Historical Setting
In the late 1930s, a group of scholars arrived at Bletchley Park in an attempt to break the Nazis’ famous Enigma cipher. This group of codebreakers included a number of notable thinkers, like Tommy Flowers and Alan Turing. As a result of the effort to crack Enigma, the first electronic computer was constructed in 1943. It was named Colossus and used thermionic valves (known today as vacuum tubes) for storing data. Other vacuum tube computers followed. For example, ENIAC (electronic numerical integrator and computer) was built by the U.S. Army in 1945 to compute ballistic firing tables.
Note: Science fiction aficionados might enjoy a movie called Colossus: The Forbin Project. It was made in 1969 and centers around Colossus, a supercomputer designed by a scientist named Charles Forbin. Forbin convinces the military that they should give control of the U.S. nuclear arsenal to Colossus in order to eliminate the potential of human error accidentally starting World War III. The movie is similar in spirit to Stanley Kubrick’s 2001: A Space Odyssey, but without the happy ending: Robot is built, robot becomes sentient, robot runs amok. I was told that everyone who has ever worked at Control Data has seen this movie.
The next earth-shaking development arrived in 1949 when ferrite (iron) core memory was invented. Each bit of memory was made of a small, circular iron magnet. The value of the bit switched from “1” to “0” by using electrical wires to magnetize the circular loops in one of two possible directions. The first computer to utilize ferrite core memory was IBM’s 705, which was put into production in 1955. Back in those days, 8KB of memory was considered a huge piece of real estate.
Everything changed once transistors became the standard way to store bits. The transistor was presented to the world in 1948 when Bell Labs decided to go public with its new device. In 1954, Bell Labs constructed the first transistor-based computer. It was named TRADIC (TRAnsistorized DIgital Computer). TRADIC was much smaller and more efficient than vacuum tube computers. For example, ENIAC required 1,000 square feet and caused power outages in Philadelphia when it was turned on. TRADIC, on the other hand, was roughly three cubic feet in size and ran on 100 watts of electricity.
Note: Before electronic computers became a feasible alternative, heavy mathematical computation relied on human computers. Large groups of people would be assembled to carry out massive numerical algorithms. Each person would do a part of a computation and pass it on to someone else. This accounts for the prevalence of logarithm tables in mathematical references like the one published by the Chemical Rubber Company (CRC). Slide rules and math tables were standard fare before the rise of the digital calculator.
ASIDE
“After 45 minutes or so, we’ll see that the results are obvious.”
—David M. Lee
I have heard Nobel laureates in physics, like Dave Lee, complain that students who rely too heavily on calculators lose their mathematical intuition. To an extent, Dave is correct. Before the dawn of calculators, errors were more common, and developing a feel for numeric techniques was a useful way to help catch errors when they occurred.
During the Los Alamos project, a scientist named Dick Feynman ran a massive human computer. He once mentioned that the performance and accuracy of his group’s computations were often more a function of his ability to motivate people. He would sometimes assemble people into teams and have them compete against each other. Not only was this a good idea from the standpoint of making things more interesting, but it was also an effective technique for catching discrepancies.
In 1958, the first integrated circuit was invented. The inventor was a fellow named Jack Kilby, who was hanging out in the basement of Texas Instruments one summer while everyone else was on vacation. A little over a decade later, in 1969, Intel came out with a 1 kilobit memory chip. After that, things really took off. By 1999, I was working on a Windows NT 4.0 workstation (service pack 3) that had 2GB of SDRAM memory.
The general trend you should be able to glean from the previous discussion is that memory components have solved performance requirements by getting smaller, faster, and cheaper. The hardware people have been able to have their cake and eat it too. However, the laws of physics place a limit on how small and how fast we can actually make electronic components. Eventually, nature itself will stand in the way of advancement. Heisenberg’s Uncertainty Principle, shown below, is what prevents us from building infinitely small components.
Δx · Δp ≥ h/(4π)
For those who are math-phobic, I will use Heisenberg’s own words to describe what this equation means:
“The more precisely the position is determined, the less precisely the momentum is known in this instant, and vice versa.”
In other words, if you know exactly where a particle is, then you will not be able to contain it because its momentum will be huge. Think of this like trying to catch a tomato seed. Every time you try to squeeze down and catch it, the seed shoots out of your hands and flies across the dinner table into Uncle Don’s face.
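To put a rough number on this, the short C program below (purely an illustrative calculation added for this discussion) plugs a position uncertainty of one nanometer into the inequality above and prints the corresponding lower bound on momentum:

    /* Back-of-the-envelope evaluation of the uncertainty bound above:
       dx * dp >= h / (4 * pi), solved for dp at dx = 1 nanometer. */
    #include <stdio.h>

    int main(void)
    {
        const double h  = 6.626e-34;               /* Planck's constant, J*s  */
        const double pi = 3.14159265358979323846;
        const double dx = 1.0e-9;                  /* position uncertainty, m */

        double dp = h / (4.0 * pi * dx);           /* minimum momentum spread */

        printf("dx = %.1e m  =>  dp >= %.2e kg*m/s\n", dx, dp);
        return 0;
    }

The bound works out to roughly 5.3×10⁻²⁶ kg·m/s, which for something as light as an electron corresponds to a velocity uncertainty on the order of tens of kilometers per second.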
Einstein’s General Theory of Relativity is what keeps us from building infinitely fast components. With the exception of black holes, the speed limit in this universe is 3×10⁸ meters per second. Eventually, these two physical limits are going to creep up on us.
When this happens, the hardware industry will have to either make larger chips (in an effort to fit more transistors in a given area) or use more efficient algorithms so that they can make better use of existing space. My guess is that relying on better algorithms will be the cheaper option. This is particularly true with regard to memory management. Memory manipulation is so frequent and crucial to performance that designing better memory management subsystems will take center stage in the future. This will make the time spent reading this book a good investment.
Tags: Algorithms, Management, Memory


Related Books

  • Agile Portfolio Management (CHM, English edition)
  • Applied Software Risk Management: A Guide for Software Project Managers
  • Mastering phpMyAdmin 2.11 for Effective MySQL Management (PDF, English edition)
