How To Make a Music Visualizer in iOS

 Xinrong Guo on June 4, 2013

Learn how to create your own music visualizer!

In the mid-seventies, Atari released the Atari Home Music player that connected a television to a stereo and thereby produced abstract images in sync with the music. Consumers could manipulate the images by twisting knobs and pushing buttons on the device.

The device was a market failure but it was the first time that the world was exposed to music visualization. Now, music visualization is a common technology that can be found in almost every digital media player such as iTunes or Windows Media Player.

To see an example of music visualization in action, simply launch iTunes, start a good tune, then choose View/Show Visualizer and allow the psychedelics to free your mind! :]

In this tutorial, you’ll create your very own music visualizer. You’ll learn how to configure the project to play music as well as support background audio and to create particle effects using UIKit’s particle system. You’ll also learn how to make those particles dance to the beat of a song.

So cue up the music and break out the disco ball, things are about to get visual!

Note: You can try out most of the tutorial using the iPhone Simulator, but you will need to run the project on a device to select different songs and to play the music in the background.

Starter project

To start things off, download this starter project. The starter project has the following functionality:

  1. It provides a simple user interface for the application.
  2. The supported interface orientation is set to landscape.
  3. The MediaPlayer.framework has been added to the project.
  4. It contains a method which allows you to pick songs from your iPod library.
  5. An image named particleTexture.png was added to the project for use by the particle system.
  6. The MeterTable.h and MeterTable.cpp C++ files were also added to the project. These were taken from the Apple sample project avTouch, and will be explained later on in this tutorial.

First, extract the downloaded project, open it in Xcode, and build and run. You should see the following:

You can tap the play button to switch between play and pause modes but you won’t hear any music until after you’ve added some code. Tap on the black area in the middle to hide/show the navigation bar and tool bar.

If you’re running in the iPhone Simulator and tap the magnifying glass icon on the bottom left, you’ll see the following warning:

This is because the iPhone Simulator doesn’t support accessing the music library. But if you are running on a device, a tap on that icon will make the media picker appear, so that you can choose a song.

Once you are familiar with the user interface, let’s get started.

Let the Music Play

Using AVAudioPlayer is an easy way to play music on an iOS device. AVAudioPlayer can be found in the AVFoundation.framework, so you need to add this framework to your project.

Note: If you are interested in learning more about the AVAudioPlayer class and what it can do, take a look at our Audio 101 for iPhone Developers: Playing Audio Programmatically tutorial.

Select iPodVisualizer in the Project Navigator and then select iPodVisualizer under TARGETS. Choose the Build Phases tab, expand the Link Binary With Libraries section, then click the + (plus) button.

Search for AVFoundation.framework in the pop up list, select it, and click Add. The framework should now appear in your project.

It’s time to write some code. Open ViewController.m and make the following changes:

// Add to the #imports section at the top of the file
#import <AVFoundation/AVFoundation.h>
 
// Add the following under the comment that reads "Add properties here"
@property (strong, nonatomic) AVAudioPlayer *audioPlayer;

This imports the AVFoundation.h header file so you can access AVAudioPlayer, and then adds a property that will hold the AVAudioPlayer instance your app will use to play audio.

And now, it’s time to play a music file.

The starter project includes a music file named DemoSong.m4a in the Resources folder that you can use. Feel free to use a different audio file if you’d like. Just remember, only the following audio codecs are supported on iOS devices for playback:

  • AAC (MPEG-4 Advanced Audio Coding)
  • ALAC (Apple Lossless)
  • HE-AAC (MPEG-4 High Efficiency AAC)
  • iLBC (internet Low Bitrate Codec, another format for speech)
  • IMA4 (IMA/ADPCM)
  • Linear PCM (uncompressed, linear pulse-code modulation)
  • MP3 (MPEG-1 audio layer 3)
  • µ-law and a-law

Still in ViewController.m, add the following method:

- (void)configureAudioPlayer {
    NSURL *audioFileURL = [[NSBundle mainBundle] URLForResource:@"DemoSong" withExtension:@"m4a"];
    NSError *error;
    self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:audioFileURL error:&error];
    if (error) {
        NSLog(@"%@", [error localizedDescription]);
    }
    [_audioPlayer setNumberOfLoops:-1];
}

This method creates a reference to the music file and stores it as audioFileURL. It then creates a new AVAudioPlayer instance initialized with audioFileURL and sets its numberOfLoops property to -1 to make the audio loop forever.

Note: If you decide to use a music file other than the provided one, do remember to add the new file to the Xcode project and to change the music file name (and perhaps the extension) in the above method.

Add the following line to the end of viewDidLoad:

[self configureAudioPlayer];

By calling configureAudioPlayer in viewDidLoad, you set up the audio player as soon as the view loads, so you can press the play button on app start and have the app play your song.

Now add the following line inside playPause, just after the comment that reads // Pause audio here:

[_audioPlayer pause];

Next, add the following line in the same method, just after the comment that reads // Play audio here:

[_audioPlayer play];

Tapping the play/pause button calls playPause. The code you just added tells audioPlayer to play or pause according to its current state as defined by _isPlaying. As the name indicates, this property identifies whether the audio player is currently playing audio or not.

Now build and run. If you did everything correctly, the app will look exactly the same. But now you can play and pause your music.

Take this brief moment to get your funk on! :]

Selecting a Song

A music player that just plays one song, no matter how cool that song may be, isn’t very useful. So you’ll add the ability to play audio from the device’s music library.

If you don’t plan on running on a device, or know how to set that up already, you can skip to the next section.

The starter project you downloaded is set up so that when the user chooses a song from the media picker, a URL for the selected song is passed to playURL: inside ViewController.m. Currently, playURL: just toggles the icon on the play/pause button.

Inside ViewController.m, add the following code to playURL: just after the comment that reads // Add audioPlayer configurations here:

self.audioPlayer = [[AVAudioPlayer alloc] initWithContentsOfURL:url error:nil];
[_audioPlayer setNumberOfLoops:-1];

The above code is much the same as what you wrote in configureAudioPlayer. However, instead of hardcoding the filename, you create a new AVAudioPlayer instance with the URL passed into the method.

Build and run on a device, and you’ll be able to choose and play a song from your music library.

Note: If you have iTunes Match, you may see items in the media picker that are not actually on your device. If you choose a song that is not stored locally, the app dismisses the media picker and does not play the audio. So if you want to hear (and soon see) something, be sure to choose a file that’s actually there :]

While running the project on a device, press the home button. You’ll notice that your music is paused. This isn’t a very good experience for a music player application, if a music player is what you’re after.

You can configure your app so that the music will continue to play even when the app enters the background. Keep in mind that this is another feature not supported in the iPhone Simulator, so run the app on a device if you want to see how it works.

To play music in the background, you need to do two things: set the audio session category, then declare the app as supporting background execution.

First, set the audio session category.

An audio session is the intermediary between your application and iOS for configuring audio behavior. Configuring your audio session establishes basic audio behavior for your application. You set your audio session category according to what your app does and how you want it to interact with the device and the system.

Add the following new method to ViewController.m:

- (void)configureAudioSession {
    NSError *error;
    [[AVAudioSession sharedInstance] setCategory:AVAudioSessionCategoryPlayback error:&error];

    if (error) {
        NSLog(@"Error setting category: %@", [error description]);
    }
}

In configureAudioSession, you get the audio session using [AVAudioSession sharedInstance] and set its category to AVAudioSessionCategoryPlayback. This identifies that the current audio session will be used for playing back audio (as opposed to recording or processing audio).

Add the following line to viewDidLoad, just before the call to [self configureAudioPlayer];:

[self configureAudioSession];

This calls configureAudioSession to configure the audio session.

Note: To learn more about audio sessions, read Apple’s Audio Session Programming Guide. Or take a look at our Background Modes in iOS Tutorial which also covers the topic, albeit not in as much detail.

Now you have to declare that your app supports background execution.

Open iPodVisualizer-Info.plist (it’s in the Supporting Files folder), select the last line, and click the plus button to add a new item. Select Required background modes as the Key from the dropdown, and the type of the item will change to Array automatically. (If it does not automatically become Array, double check the Key.)

Expand the item, set the value of Item0 to App plays audio. (If you have a wide Xcode window, you might not notice that the value is a dropdown list. But you can access the list by simply tapping the dropdown icon at the end of the field.)

When you are done, build and run on a device, pick a song and play it, press the home button, and this time your music should continue to play without interruption even if your app is in the background.

Visualizing with Music

Your music visualizer will be based on a UIKit particle system. If you don’t know much about particle systems, you may want to read UIKit Particle Systems In iOS 5 or How To Make a Letter / Word Game with UIKit: Part 3/3 to familiarize yourself with the necessary background information; this tutorial does not go into detail explaining the particle system basics.

First, add the QuartzCore.framework to your project (the same way you added the AVFoundation.framework).

Now choose File/New/File…, and select the iOS/Cocoa Touch/Objective-C class template. Name the class VisualizerView, make it a subclass of UIView, click Next and then Create.

Select VisualizerView.m in the Xcode Project Navigator and change its extension from .m to .mm. (You can rename it by clicking the file twice slowly in the Project Navigator. That is, do not click it fast enough to be considered a double-click.) The .mm extension tells Xcode that this file needs to be compiled as C++, which is necessary because later it will access the C++ class MeterTable.

Open VisualizerView.mm and replace its contents with the following:

#import "VisualizerView.h"
#import <QuartzCore/QuartzCore.h>
 
@implementation VisualizerView {
    CAEmitterLayer *emitterLayer;
}
 
// 1
+ (Class)layerClass {
    return [CAEmitterLayer class];
}
 
- (id)initWithFrame:(CGRect)frame
{
    self = [super initWithFrame:frame];
    if (self) {
        [self setBackgroundColor:[UIColor blackColor]];
        emitterLayer = (CAEmitterLayer *)self.layer;
 
        // 2
        CGFloat width = MAX(frame.size.width, frame.size.height);
        CGFloat height = MIN(frame.size.width, frame.size.height);
        emitterLayer.emitterPosition = CGPointMake(width/2, height/2);
        emitterLayer.emitterSize = CGSizeMake(width-80, 60);
        emitterLayer.emitterShape = kCAEmitterLayerRectangle;
        emitterLayer.renderMode = kCAEmitterLayerAdditive;
 
        // 3
        CAEmitterCell *cell = [CAEmitterCell emitterCell];
        cell.name = @"cell";
        cell.contents = (id)[[UIImage imageNamed:@"particleTexture.png"] CGImage];
 
        // 4
        cell.color = [[UIColor colorWithRed:1.0f green:0.53f blue:0.0f alpha:0.8f] CGColor];
        cell.redRange = 0.46f;
        cell.greenRange = 0.49f;
        cell.blueRange = 0.67f;
        cell.alphaRange = 0.55f;
 
        // 5
        cell.redSpeed = 0.11f;
        cell.greenSpeed = 0.07f;
        cell.blueSpeed = -0.25f;
        cell.alphaSpeed = 0.15f;
 
        // 6
        cell.scale = 0.5f;
        cell.scaleRange = 0.5f;
 
        // 7
        cell.lifetime = 1.0f;
        cell.lifetimeRange = .25f;
        cell.birthRate = 80;
 
        // 8
        cell.velocity = 100.0f;
        cell.velocityRange = 300.0f;
        cell.emissionRange = M_PI * 2;
 
        // 9
        emitterLayer.emitterCells = @[cell];
    }
    return self;
}
 
@end

The above code mainly configures a UIKit particle system, as follows:

  1. Overrides layerClass to return CAEmitterLayer, which allows this view to act as a particle emitter.
  2. Shapes the emitter as a rectangle that extends across most of the center of the screen. Particles are initially created within this area.
  3. Creates a CAEmitterCell that renders particles using particleTexture.png, included in the starter project.
  4. Sets the particle color, along with a range by which each of the red, green, and blue color components may vary.
  5. Sets the speed at which the color components change over the lifetime of the particle.
  6. Sets the scale and the amount by which the scale can vary for the generated particles.
  7. Sets the amount of time each particle will exist to between 0.75 and 1.25 seconds, and sets the cell to create 80 particles per second.
  8. Configures the emitter to create particles with a variable velocity, and to emit them in any direction.
  9. Adds the emitter cell to the emitter layer.

Again, read the previously mentioned tutorials if you would like to know more about the fun things you can do with UIKit particle systems and how the above configuration values affect the generated particles.

Next open ViewController.m and make the following changes:

//Add with the other imports
#import "VisualizerView.h"
 
//Add with the other properties
@property (strong, nonatomic) VisualizerView *visualizer;

Now add the following to viewDidLoad, just before the line that reads [self configureAudioPlayer];:

self.visualizer = [[VisualizerView alloc] initWithFrame:self.view.frame];
[_visualizer setAutoresizingMask:UIViewAutoresizingFlexibleHeight | UIViewAutoresizingFlexibleWidth];
[_backgroundView addSubview:_visualizer];

This creates a VisualizerView instance that will fill its parent view and adds it to _backgroundView. (_backgroundView was defined as part of the starter project, and is just a view layered behind the music controls.)

Build and run; you will see the particle system in action immediately:

While that looks very cool indeed, you want the particles to “beat” in sync with your music. This is done by changing the size of particles when the decibel level of the music changes.

First, open VisualizerView.h and make the following changes:

//Add with the other imports
#import <AVFoundation/AVFoundation.h>
 
//Add within the @interface and @end lines
@property (strong, nonatomic) AVAudioPlayer *audioPlayer;

The new property will give your visualizer access to the app’s audio player, and hence the audio levels, but before you can use that information, you need to set up one more thing.

Switch to ViewController.m and search for setNumberOfLoops. If you skipped the section about running on the device, it will appear only once (in configureAudioPlayer); otherwise, it will appear twice (in configureAudioPlayer and in playURL:).

Add the following code just after any occurrence of the line [_audioPlayer setNumberOfLoops:-1];:

[_audioPlayer setMeteringEnabled:YES];
[_visualizer setAudioPlayer:_audioPlayer];

With the above code, you instruct the AVAudioPlayer instance to make audio-level metering data available. You then pass _audioPlayer to the _visualizer so that it can access that data.

Now switch to VisualizerView.mm and modify it as follows:

// Add with the other imports
#import "MeterTable.h"
 
// Change the private variable section of the implementation to look like this
@implementation VisualizerView {
    CAEmitterLayer *emitterLayer;
    MeterTable meterTable;
}

The above code gives you access to a MeterTable instance named meterTable. The starter project includes the C++ class MeterTable, which you’ll use to help process the audio levels from AVAudioPlayer.

What’s all this talk about metering? It should be easy to understand once you see the image below:

You’ve most likely seen something similar on the front of a sound system, bouncing along to the music. It simply shows you the relative intensity of the audio at any given time. MeterTable is a helper class that can be used to divide decibel values into ranges used to produce images like the one above.

You will use MeterTable to convert values into a range from 0 to 1 and you will use that new value to adjust the size of the particles in your music visualizer.

Add the following method to VisualizerView.mm:

- (void)update
{
    // 1
    float scale = 0.5;
    if (_audioPlayer.playing)
    {
        // 2
        [_audioPlayer updateMeters];
 
        // 3
        float power = 0.0f;
        for (int i = 0; i < [_audioPlayer numberOfChannels]; i++) {
            power += [_audioPlayer averagePowerForChannel:i];
        }
        power /= [_audioPlayer numberOfChannels];
 
        // 4
        float level = meterTable.ValueAt(power);
        scale = level * 5;
    }
 
    // 5
    [emitterLayer setValue:@(scale) forKeyPath:@"emitterCells.cell.scale"];
}

Each time the above method is called, it updates the size of the visualizer’s particles. Here’s how it works:

  1. You set scale to a default value of 0.5 and then check to see whether or not _audioPlayer is playing.
  2. If it is playing, you call updateMeters on _audioPlayer, which refreshes the AVAudioPlayer data based on the current audio.
  3. This is the meat of the method. For each audio channel (e.g. two for a stereo file), the average power for that channel is added to power. The average power is a decibel value. After the powers of all the channels have been added together, power is divided by the number of channels. This means power now holds the average power, or decibel level, for all of the audio.
  4. Here you pass the calculated average power value to meterTable's ValueAt method. It returns a value from 0 to 1, which you multiply by 5 and then set as the scale. Multiplying by 5 accentuates the music’s effect on the scale.

    Note: Why use meterTable to convert power's value? Because it simplifies the code you have to write. Otherwise, your code would have to handle the broad range of values returned by averagePowerForChannel. A return value of 0 indicates full scale, or maximum power; a return value of -160 indicates minimum power (that is, near silence). But the signal provided to the audio player may actually exceed the range of what’s considered full scale, so values can go beyond those limits. Using meterTable gives you a nice value from 0 to 1. No fuss, no muss.

  5. Finally, the scale of the emitter’s particles is set to the new scale value. (If _audioPlayer was not playing, this will be the default scale of 0.5; otherwise, it will be some value based on the current audio levels.)

Right now your app doesn’t call update and so the new code has no effect. Fix that by modifyinginitWithFrame: in VisualizerView.mm by adding the following lines just after emitterLayer.emitterCells = @[cell]; (but still inside the closing curly brace):

CADisplayLink *dpLink = [CADisplayLink displayLinkWithTarget:self selector:@selector(update)];
[dpLink addToRunLoop:[NSRunLoop currentRunLoop] forMode:NSRunLoopCommonModes];

Here you set up a CADisplayLink. A CADisplayLink is a timer that allows your application to synchronize its drawing to the refresh rate of the display. That is, it behaves much like an NSTimer with a 1/60-second time interval, except that it’s guaranteed to be called each time the device prepares to redraw the screen, which is usually at a rate of 60 times per second.

The first line you added above creates an instance of CADisplayLink set up to call update on the target self. That means it will call the update method you just defined during each screen refresh.

The second line calls addToRunLoop:forMode:, which starts the display link timer.

Note: Adding the CADisplayLink to a run loop is a low-level concept related to threading. For this tutorial, you just need to understand that the CADisplayLink will be called for every screen update. But if you want to learn more, you can check out the class references for CADisplayLink or NSRunLoop, or read through the Run Loops chapter in Apple’s Threading Programming Guide.

Now build, run, and play some music. You will notice that the particles change size, but they don’t “beat” with the music. This is because changing the cell’s scale only affects newly created particles; the particles already on the screen keep their original size.

This needs to be fixed.

Open VisualizerView.mm and modify initWithFrame: as follows:

    // Remove this line
    // cell.contents = (id)[[UIImage imageNamed:@"particleTexture.png"] CGImage];
 
    // And replace it with the following lines
    CAEmitterCell *childCell = [CAEmitterCell emitterCell];
    childCell.name = @"childCell";
    childCell.lifetime = 1.0f / 60.0f;
    childCell.birthRate = 60.0f;
    childCell.velocity = 0.0f;
 
    childCell.contents = (id)[[UIImage imageNamed:@"particleTexture.png"] CGImage];
 
    cell.emitterCells = @[childCell];

Like CAEmitterLayer, CAEmitterCell also has a property named emitterCells. This means that a CAEmitterCell can contain another CAEmitterCell, which results in particles emitting particles. That’s right, folks, it’s particles all the way down! :]

Also notice that you set the child’s lifetime to 1/60th of a second. This means that particles emitted by childCell will have a lifetime the same length as a single screen refresh. You set birthRate to 60, which means 60 particles are emitted per second. Since each dies in 1/60th of a second, a new particle is always created just as the previous one dies. And you thought your day was short :]

Build and run; you will see the particle system works the same as it did before, but it still doesn’t beat to the music. You can try setting birthRate to 30 to help you understand how the setting works (just don’t forget to set it back to 60).

So how do you get the particle system to beat to the music?

The last line of update currently looks like this:

[emitterLayer setValue:@(scale) forKeyPath:@"emitterCells.cell.scale"];

Replace that line with the following:

[emitterLayer setValue:@(scale) forKeyPath:@"emitterCells.cell.emitterCells.childCell.scale"];

Now build and run. You will see that all the particles beat with your music.


So what did the above change do?

Particles are created and destroyed at the same rate as a screen refresh. That means that every time the screen is redrawn, a new set of particles is created and the previous set is destroyed. Since new particles are always created with a size calculated from the audio-levels at that moment, the particles appear to pulse with the music.

Congratulations, you have just made a cool music visualizer application!

Where to go from here?

Here is the complete example project with all of the code from the above tutorial.

This tutorial gave you a basic idea of how to add a music visualization system to your app. But you can take it further:

  • You can add more music controls to make the project a fully functional music player.
  • You could create a slightly more sophisticated visualizer that modified a separate particle system for each audio channel, rather than blending all audio channels into a single value.
  • Try creating different kinds of particle systems (this tool, UIEffectDesigner, may help).
  • Or maybe try changing the shape of your emitter layer and moving it around within the view.

While you’re at it, check out Apple’s sample project aurioTouch2. It’s an advanced use of music visualization and a great way to learn more about the subject.

Have fun!
