bc::copy, copy vector data to cube, don't support, now! #65

Open · xinsuinizhuan opened this issue Apr 27, 2020 · 9 comments

@xinsuinizhuan
std::vector<std::vector<double>> matricsInputSeries;
cube inputs(img_sz, network->m_batch_size, training_sets);
bc::copy(inputs[i][j].get_stream(),
         matricsInputSeries[i * m_batch_size + j].begin(),
         matricsInputSeries[i * m_batch_size + j].end(),
         inputs[i][j].cw_begin());

Right now I can't use bc::copy to copy vector data into the cube input!

@xinsuinizhuan xinsuinizhuan changed the title bc::copy, copy vector data to cube, don't supprot,now! bc::copy, copy vector data to cube, don't support, now! Apr 27, 2020
josephjaspers added a commit that referenced this issue Apr 27, 2020
@josephjaspers (Owner)

Hi, I've added tests to cover this issue:
b035502
6ed9e59

However, I was able to compile the above code successfully (Windows, VS2019) and the tests ran successfully, without my changing any internal code, so I am unsure why it wouldn't work.

Could you post the error and compilation output?
Thanks!
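For reference, here is the copy from the original report as a minimal self-contained sketch (the header name is an assumption on my part, and I assume bc::Cube<double> defaults to the host system; the bc::copy call matches the signature used above):

#include <vector>
#include "BlackCat_Tensors.h"  // assumed header name

void fill_cube() {
    const int img_sz = 784, batch_size = 32, training_sets = 10;  // example sizes
    bc::Cube<double> inputs(img_sz, batch_size, training_sets);

    // one host vector of length img_sz per (training set, batch) pair
    std::vector<std::vector<double>> series(
        training_sets * batch_size, std::vector<double>(img_sz, 0.0));

    for (int i = 0; i < training_sets; ++i)
        for (int j = 0; j < batch_size; ++j)
            bc::copy(inputs[i][j].get_stream(),
                     series[i * batch_size + j].begin(),
                     series[i * batch_size + j].end(),
                     inputs[i][j].cw_begin());
}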

@josephjaspers (Owner)

Additionally, there is a new class I haven't documented yet, VecList, which behaves like a regular bc::Vector but also has a "push_back" method. So you might be able to do something like...

bc::VecList<double> veclist;
for (int i = 0; i < other_data.size(); ++i) {
    veclist.push_back(other_data[i]);
}

bc::Cube<double> data(size, batch_sz, other_size);
data[0][0] = veclist;  // assign the accumulated vector into one cube slice
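(Here VecList serves as a growable host-side staging buffer: you accumulate values with push_back and then assign the whole vector into the cube slice in one expression, instead of copying element by element.)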

@xinsuinizhuan (Author)

I've run into a strange problem: every function compiles fine as long as I don't call it, but as soon as I call the function I get many compile errors.
Not calling the function:
[screenshot]
Calling the function:
[screenshot]
error.txt

blackcat_tensors_consoledemo.zip

josephjaspers added a commit that referenced this issue Apr 29, 2020
@josephjaspers (Owner)

Hi, I fixed the issue.
I had previously removed utility support (print/copying) from expressions, but I have re-added it in the last few commits.

A while back I re-ordered the template arguments of the Allocator class, so in your code you have to change

	using allocator_type = bc::Allocator<System, value_type>;

to

	using allocator_type = bc::Allocator<value_type, System>;

If you pull the newest commits and make those changes, your code should work.
(I was able to compile it successfully on my Windows VM.)
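For context, a minimal sketch of the updated alias inside a concrete typedef block (the tag name bc::host_tag is my assumption here, used only for illustration):

using value_type = double;
using system_tag = bc::host_tag;  // assumed system tag for illustration
// old argument order (no longer compiles): bc::Allocator<system_tag, value_type>
using allocator_type = bc::Allocator<value_type, system_tag>;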

I added tests to cover this issue.
Let me know if it works!

Thanks!

@xinsuinizhuan (Author)

Thanks, it works. Another problem: when I predict on the trained data the results look good, but when I predict on untrained data the results are poor. I tried decreasing the number of epochs, but that didn't seem to help. I don't know why.

@josephjaspers (Owner)

Hmmm... I'm not sure why. If you send me what you're working on, I could take a look.

@xinsuinizhuan (Author)

xinsuinizhuan commented May 6, 2020

like this:

auto network = bc::nn::neuralnetwork(
    bc::nn::lstm(system_tag, 784/4, 128),      // LSTM layer: 196 inputs -> 128 hidden units
    bc::nn::lstm(system_tag, 128, 64),         // LSTM layer: 128 -> 64
    bc::nn::feedforward(system_tag, 64, 10),   // dense layer: 64 -> 10
    bc::nn::softmax(system_tag, 10),           // softmax over the 10 class outputs
    bc::nn::logging_output_layer(system_tag, 10, bc::nn::RMSE).skip_every(100)  // log RMSE every 100th call
);

bc::print("Neural Network architecture:");
bc::print(network.get_string_architecture());


network.set_learning_rate(0.001);
network.set_batch_size(batch_size);

It breaks at network.set_learning_rate(0.001);
[screenshot]

mnist_test.txt

@josephjaspers (Owner)

I will check!

@josephjaspers (Owner)

Fixed!
87e9749

Silly bug (infinite recursion); I added set_learning_rate to the mnist_recurrent example to catch this bug in the future!
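For illustration, the bug class in miniature (hypothetical code, not the library's actual implementation): a setter that accidentally calls itself instead of assigning the member recurses until the stack overflows.

struct network {
    double lr = 0.01;
    void set_learning_rate(double value) {
        set_learning_rate(value);  // bug: forwards to itself -> infinite recursion, stack overflow
        // lr = value;             // intended behavior
    }
};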
