# speech signal segmentation

Started by G.N., June 13, 2007
Dear Friends,
I am new to DSP. I am trying to implement some speech correction algorithms, but I have run into a problem. Can anyone provide a MATLAB m-file, or advise me on how to segment a speech signal that has been read into a vector (let's say X) with the wavread function? I would be glad for any responses.

Best regards
G.N
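A common way to segment a speech vector is to split it into short, optionally overlapping frames. Here is a minimal NumPy sketch of that idea (the frame length and hop size are illustrative choices, not anything prescribed in the thread):

```python
import numpy as np

def segment(x, frame_len=400, hop=200):
    """Split a 1-D signal into overlapping frames (one frame per row)."""
    n_frames = 1 + (len(x) - frame_len) // hop
    return np.stack([x[i * hop : i * hop + frame_len]
                     for i in range(n_frames)])

# 1 second of a dummy "speech" signal at 8 kHz
x = np.random.randn(8000)
frames = segment(x, frame_len=400, hop=200)
print(frames.shape)  # (39, 400)
```

In MATLAB, if the Signal Processing Toolbox is available, buffer(X, 400, 200) frames a vector in much the same way (with frames in columns rather than rows).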
Dear Friends,

for

x(n) = u(n) - u(n-10); h(n) = ((0.9)^n) u(n);

I'd like to obtain y(n) which is equal to x(n)*h(n)

I would normally use convolution to obtain y(n); in this case, I have to
consider the regions n < 0, 0 <= n <= 9, and n >= 10 separately.

Someone told me I can use the filter function/command to solve it indirectly,

does anyone know how?

Regards

Dean
x(n) is a causal sequence built from the step u(n) and a delayed step u(n-10), so in the z-domain

X(z) = U(z).(1 - z^(-10))

The impulse response h(n) = (0.9^n) u(n) corresponds to the transfer function

H(z) = 1 / (1 - 0.9 z^(-1))

and therefore

Y(z)
------ = (1 - z^(-10)) / (1 - 0.9 z^(-1))
U(z)

You can arrive at the same result by using the fact that two filters h1 and h2 in series can be replaced by a single filter with transfer function H1(z).H2(z).

The RHS is your transfer function if you want to use the filter command. I assume u(n) is the unit step with amplitude 1.
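To illustrate the point, here is a sketch in Python (scipy.signal.lfilter is the analogue of MATLAB's filter): feeding the unit step through the combined transfer function (1 - z^(-10)) / (1 - 0.9 z^(-1)) gives the same result as directly convolving x(n) and h(n):

```python
import numpy as np
from scipy.signal import lfilter

N = 50
n = np.arange(N)

# Direct route: convolve x(n) = u(n) - u(n-10) with h(n) = 0.9^n u(n)
x = (n < 10).astype(float)   # rectangular pulse of length 10
h = 0.9 ** n                 # impulse response, truncated to N samples
y_conv = np.convolve(x, h)[:N]

# Filter route: Y(z)/U(z) = (1 - z^-10) / (1 - 0.9 z^-1), driven by the step
b = np.zeros(11)
b[0], b[10] = 1.0, -1.0      # numerator: 1 - z^-10
a = [1.0, -0.9]              # denominator: 1 - 0.9 z^-1
u = np.ones(N)               # unit step
y_filt = lfilter(b, a, u)

print(np.allclose(y_conv, y_filt))  # True
```

The corresponding MATLAB call would be y = filter([1 zeros(1,9) -1], [1 -0.9], ones(1,50)).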


Amit Pathania
Hi all,

I have been trying to find the reason for an observation I have made
regarding MCC-compiled code.

My test function looks like:

function [] = test_debug(num_sim)

num_sim = str2num(num_sim);
num_inputs = 2;

for index = 1:num_sim
    disp('Start of Simulation');
    tic;
    input_block = (rand(1,520*8) > 0.5);
    for idx = 1:length(input_block)/num_inputs
        temp = input_block((idx-1)*num_inputs+1:idx*num_inputs) * 2.^[num_inputs-1:-1:0].';
    end
    disp(sprintf('End of Simulation Count %d',index));
    toc;
end

return;

I compiled it using:

mcc -v -x -B sgl -d ./Temp test_debug -o ../../Executable/test_debug

I ran the created executable from the DOS prompt using

test_debug 5

All the iterations gave me increasing simulation time.

Then I replaced the statement in the for loop:

temp = input_block((idx-1)*num_inputs+1:idx*num_inputs) * 2.^[num_inputs-1:-1:0].';

with

x = 2.^[num_inputs-1:-1:0].';
temp = input_block((idx-1)*num_inputs+1:idx*num_inputs) * x;

Now each iteration of the executable run took a uniform amount of time. However, if I run this code at the MATLAB command prompt, both versions run equally fast. Is this an expected (known) issue?
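For what it's worth, the statement in question can also be vectorized away entirely, which removes the inner loop altogether. A Python/NumPy sketch of the same bit-to-integer packing (this is an illustrative analogue, not the MATLAB original, and I'm assuming the MATLAB version behaves equivalently):

```python
import numpy as np

num_inputs = 2
input_block = (np.random.rand(520 * 8) > 0.5).astype(float)

# Loop version, with the loop-invariant weight vector hoisted out
# (the change that gave uniform timings in the compiled run)
w = 2.0 ** np.arange(num_inputs - 1, -1, -1)
vals_loop = np.empty(len(input_block) // num_inputs)
for idx in range(len(vals_loop)):
    vals_loop[idx] = input_block[idx*num_inputs:(idx+1)*num_inputs] @ w

# Fully vectorized equivalent: one reshape plus one matrix-vector product
vals_vec = input_block.reshape(-1, num_inputs) @ w

print(np.allclose(vals_loop, vals_vec))  # True
```

In MATLAB the analogous rewrite would be reshape(input_block, num_inputs, []).' * x, computed once per simulation instead of once per inner-loop pass.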

Thanks and Regards,

Amit Shaw