Playing audio from a stream using C#

Is there a way in C# to play audio (e.g. an MP3) directly from a System.IO stream, such as one returned from a WebRequest, without temporarily saving the data to disk?


NAudio solution

With the help of NAudio 1.3 it is possible to:

  1. Load an MP3 file from a URL into a MemoryStream
  2. Convert the MP3 data to wave data once it has been completely loaded
  3. Play back the wave data using NAudio's WaveOut class

It would be nice to be able to play a half-loaded MP3 file, but that appears to be impossible due to the design of the NAudio library.

This function will do the job:

    // Requires System.IO, System.Net, System.Threading and NAudio.Wave.
    public static void PlayMp3FromUrl(string url)
    {
        using (Stream ms = new MemoryStream())
        {
            // Download the complete MP3 into memory first.
            using (Stream stream = WebRequest.Create(url)
                .GetResponse().GetResponseStream())
            {
                byte[] buffer = new byte[32768];
                int read;
                while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    ms.Write(buffer, 0, read);
                }
            }

            // Rewind, decode the MP3 to PCM and play it back.
            ms.Position = 0;
            using (WaveStream blockAlignedStream =
                new BlockAlignReductionStream(
                    WaveFormatConversionStream.CreatePcmStream(
                        new Mp3FileReader(ms))))
            {
                using (WaveOut waveOut = new WaveOut(WaveCallbackInfo.FunctionCallback()))
                {
                    waveOut.Init(blockAlignedStream);
                    waveOut.Play();
                    while (waveOut.PlaybackState == PlaybackState.Playing)
                    {
                        System.Threading.Thread.Sleep(100);
                    }
                }
            }
        }
    }
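
Calling it is straightforward (the URL below is just a placeholder); note that the call blocks until the file has been downloaded and played to the end:

    // Blocks the calling thread until the whole file has been downloaded and played.
    PlayMp3FromUrl("http://example.com/sample.mp3");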

The SoundPlayer class can do this. It looks like all you have to do is set its Stream property to the stream, then call Play.

Edit: I don't think it can play MP3 files though; it seems limited to .wav. I'm not certain if there's anything in the framework that can play an MP3 file directly. Everything I find about that involves either using a WMP control or interacting with DirectX.
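
For .wav content, a minimal sketch of that approach might look like this (the URL is a placeholder, the whole response is buffered into a MemoryStream before playback, and SoundPlayer will throw on MP3 data):

    using System.IO;
    using System.Media;
    using System.Net;

    class WavStreamDemo
    {
        static void Main()
        {
            // Placeholder URL; SoundPlayer only understands .wav data.
            var request = WebRequest.Create("http://example.com/sample.wav");
            using (var response = request.GetResponse())
            using (var web = response.GetResponseStream())
            using (var ms = new MemoryStream())
            {
                web.CopyTo(ms);       // buffer the whole file in memory
                ms.Position = 0;

                var player = new SoundPlayer();
                player.Stream = ms;   // point SoundPlayer at the in-memory stream
                player.PlaySync();    // blocks until playback finishes
            }
        }
    }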

Edit: Answer updated to reflect changes in recent versions of NAudio

It's possible using the NAudio open source .NET audio library I have written. It looks for an ACM codec on your PC to do the conversion. The Mp3FileReader supplied with NAudio currently expects to be able to reposition within the source stream (it builds an index of MP3 frames up front), so it is not appropriate for streaming over the network. However, you can still use the MP3Frame and AcmMp3FrameDecompressor classes in NAudio to decompress streamed MP3 on the fly.

I have posted an article on my blog explaining how to play back an MP3 stream using NAudio. Essentially you have one thread downloading MP3 frames, decompressing them and storing them in a BufferedWaveProvider. Another thread then plays back using the BufferedWaveProvider as an input.
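
A rough sketch of that two-thread approach (simplified, with illustrative URL and buffer sizes, and no error handling or thread synchronization; the NAudio types used are Mp3Frame, AcmMp3FrameDecompressor, BufferedWaveProvider and WaveOutEvent):

    using System;
    using System.Net;
    using System.Threading;
    using NAudio.Wave;

    class Mp3StreamingSketch
    {
        static void Main()
        {
            // Placeholder URL of an MP3 stream.
            var response = WebRequest.Create("http://example.com/stream.mp3").GetResponse();
            BufferedWaveProvider bufferedWaveProvider = null;
            IMp3FrameDecompressor decompressor = null;
            var pcm = new byte[16384 * 4]; // room for one decompressed frame

            // Thread 1: read MP3 frames, decompress them, push PCM into the buffer.
            var downloadThread = new Thread(() =>
            {
                using (var stream = response.GetResponseStream())
                {
                    Mp3Frame frame;
                    while ((frame = Mp3Frame.LoadFromStream(stream)) != null)
                    {
                        if (decompressor == null)
                        {
                            // Set up the decompressor and playback buffer from the first frame.
                            var mp3Format = new Mp3WaveFormat(frame.SampleRate,
                                frame.ChannelMode == ChannelMode.Mono ? 1 : 2,
                                frame.FrameLength, frame.BitRate);
                            decompressor = new AcmMp3FrameDecompressor(mp3Format);
                            bufferedWaveProvider = new BufferedWaveProvider(decompressor.OutputFormat);
                            bufferedWaveProvider.BufferDuration = TimeSpan.FromSeconds(20);
                        }
                        int bytes = decompressor.DecompressFrame(frame, pcm, 0);
                        bufferedWaveProvider.AddSamples(pcm, 0, bytes);

                        // Throttle the download when the playback buffer is nearly full
                        // (AddSamples throws if the buffer overflows).
                        while (bufferedWaveProvider.BufferedDuration.TotalSeconds > 15)
                            Thread.Sleep(500);
                    }
                }
            });
            downloadThread.IsBackground = true;
            downloadThread.Start();

            // Thread 2 (here, the main thread): wait for some data, then play from the buffer.
            while (bufferedWaveProvider == null)
                Thread.Sleep(100);
            using (var waveOut = new WaveOutEvent())
            {
                waveOut.Init(bufferedWaveProvider);
                waveOut.Play();
                Console.ReadLine(); // keep playing until Enter is pressed
            }
        }
    }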

NAudio wraps the WaveOutXXXX API. I haven't looked at the source, but if NAudio exposes the waveOutWrite() function in a way that doesn't automatically stop playback on each call, then you should be able to do what you really want, which is to start playing the audio stream before you've received all the data.

Using the waveOutWrite() function allows you to "read ahead" and dump smaller chunks of audio into the output queue - Windows will automatically play the chunks seamlessly. Your code would have to take the compressed audio stream and convert it to small chunks of WAV audio on the fly; this part would be really difficult - all the libraries and components I've ever seen do MP3-to-WAV conversion an entire file at a time. Probably your only realistic chance is to do this using WMA instead of MP3, because you can write simple C# wrappers around the multimedia SDK.
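
For reference, the raw winmm calls being described look roughly like this (a minimal P/Invoke sketch, not NAudio's actual internals): each chunk of PCM is wrapped in a WAVEHDR, prepared, and queued with waveOutWrite, and Windows plays the queued chunks back to back.

    using System;
    using System.Runtime.InteropServices;

    static class WinMm
    {
        // One queued chunk of PCM audio.
        [StructLayout(LayoutKind.Sequential)]
        public struct WAVEHDR
        {
            public IntPtr lpData;       // pointer to the PCM bytes for this chunk
            public uint dwBufferLength; // chunk length in bytes
            public uint dwBytesRecorded;
            public IntPtr dwUser;
            public uint dwFlags;
            public uint dwLoops;
            public IntPtr lpNext;
            public IntPtr reserved;
        }

        [DllImport("winmm.dll")]
        public static extern int waveOutPrepareHeader(IntPtr hWaveOut, ref WAVEHDR header, int headerSize);

        [DllImport("winmm.dll")]
        public static extern int waveOutWrite(IntPtr hWaveOut, ref WAVEHDR header, int headerSize);
    }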

Bass can do just this: play from a Byte[] in memory, or through file delegates where you return the data yourself, so you can start playback as soon as you have buffered enough data.

I haven't tried it from a WebRequest, but both the Windows Media Player ActiveX and the MediaElement (from WPF) components are capable of playing and buffering MP3 streams.

I have used it to play data coming from a SHOUTcast stream and it worked great. However, I'm not sure if it will work in the scenario you propose.
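
As an illustration of the WPF route (a minimal sketch; the URL is a placeholder and error handling is omitted), MediaElement just needs a URI and handles the buffering itself:

    using System;
    using System.Windows;
    using System.Windows.Controls;

    public class PlayerWindow : Window
    {
        private readonly MediaElement player = new MediaElement();

        public PlayerWindow()
        {
            // Manual behavior is required so Play()/Stop() can be called from code.
            player.LoadedBehavior = MediaState.Manual;
            Content = player;

            // Placeholder stream URL; MediaElement buffers and plays it.
            player.Source = new Uri("http://example.com/stream.mp3");
            player.Play();
        }
    }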

I've always used FMOD for things like this because it's free for non-commercial use and works well.

That said, I'd gladly switch to something that's smaller (FMOD is ~300k) and open-source. Super bonus points if it's fully managed so that I can compile / merge it with my .exe and not have to take extra care to get portability to other platforms...

(FMOD does portability too but you'd obviously need different binaries for different platforms)

I wrapped the mpg123 MP3 decoder library and made it available to .NET developers as mpg123.net.

Included are samples for converting MP3 files to PCM and reading ID3 tags.

I slightly modified the original poster's source so that it can now play a file that has not been fully loaded yet. Here it is (note that it is just a sample and a starting point; you need to add some exception and error handling):

    // Requires System.IO, System.Net, System.Threading and NAudio.Wave.
    private Stream ms = new MemoryStream();

    public void PlayMp3FromUrl(string url)
    {
        // Download on a background thread, appending to the shared MemoryStream.
        new Thread(delegate(object o)
        {
            var response = WebRequest.Create(url).GetResponse();
            using (var stream = response.GetResponseStream())
            {
                byte[] buffer = new byte[65536]; // 64KB chunks
                int read;
                while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    var pos = ms.Position;
                    ms.Position = ms.Length;   // append to the end...
                    ms.Write(buffer, 0, read);
                    ms.Position = pos;         // ...then restore the read position
                }
            }
        }).Start();

        // Pre-buffering some data to allow NAudio to start playing
        while (ms.Length < 65536 * 10)
            Thread.Sleep(1000);

        ms.Position = 0;
        using (WaveStream blockAlignedStream = new BlockAlignReductionStream(
            WaveFormatConversionStream.CreatePcmStream(new Mp3FileReader(ms))))
        {
            using (WaveOut waveOut = new WaveOut(WaveCallbackInfo.FunctionCallback()))
            {
                waveOut.Init(blockAlignedStream);
                waveOut.Play();
                while (waveOut.PlaybackState == PlaybackState.Playing)
                {
                    System.Threading.Thread.Sleep(100);
                }
            }
        }
    }

I've tweaked the source posted in the question to allow usage with Google's TTS API in order to answer the question here:

    bool waiting = false;
    AutoResetEvent stop = new AutoResetEvent(false);

    public void PlayMp3FromUrl(string url, int timeout)
    {
        using (Stream ms = new MemoryStream())
        {
            // Download the complete MP3 into memory.
            using (Stream stream = WebRequest.Create(url)
                .GetResponse().GetResponseStream())
            {
                byte[] buffer = new byte[32768];
                int read;
                while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
                {
                    ms.Write(buffer, 0, read);
                }
            }

            ms.Position = 0;
            using (WaveStream blockAlignedStream =
                new BlockAlignReductionStream(
                    WaveFormatConversionStream.CreatePcmStream(
                        new Mp3FileReader(ms))))
            {
                using (WaveOut waveOut = new WaveOut(WaveCallbackInfo.FunctionCallback()))
                {
                    waveOut.Init(blockAlignedStream);
                    waveOut.PlaybackStopped += (sender, e) =>
                    {
                        waveOut.Stop();
                    };
                    waveOut.Play();
                    waiting = true;
                    stop.WaitOne(timeout); // wait until signalled or until the timeout elapses
                    waiting = false;
                }
            }
        }
    }

Invoke with:

    var playThread = new Thread(timeout => PlayMp3FromUrl(
        "http://translate.google.com/translate_tts?q=" + HttpUtility.UrlEncode(relatedLabel.Text),
        (int)timeout));
    playThread.IsBackground = true;
    playThread.Start(10000);

Terminate with:

    if (waiting)
        stop.Set();

Notice that I'm using a ParameterizedThreadStart delegate in the code above, and the thread is started with playThread.Start(10000);. The 10000 represents a maximum of 10 seconds of audio to be played, so it will need to be tweaked if your stream takes longer than that to play. This is necessary because the current version of NAudio (v1.5.4.0) seems to have a problem determining when the stream has finished playing. It may be fixed in a later version, or perhaps there is a workaround that I didn't take the time to find.