Questions about numpy compatible versions #291

Open

Aure712 opened this issue Nov 20, 2024 · 3 comments

Comments

Aure712 commented Nov 20, 2024

I got an error while running the Deep learning-based approach step: "TypeError: argument 1 must be numpy.ndarray, not numpy.ndarray". I suspect this is a NumPy version compatibility issue; I am using NumPy 2.0.2. Could you confirm whether this is indeed a compatibility problem, and if so, which version of NumPy should I use?

Sichao25 (Contributor) commented

In general, I recommend using NumPy <2.0; we haven't updated to the latest version in order to stay compatible with other popular tools in this field. Regarding your specific issue, would you mind sharing the error traceback so I can pinpoint where the problem occurs?
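
For anyone hitting the same error, here is a minimal pre-flight sketch (not part of the original reply; the packaging dependency and the <2.0 bound are assumptions) that checks the installed NumPy version before running the segmentation step:

# Hypothetical version check, assuming NumPy <2.0 is the supported range.
import numpy as np
from packaging.version import Version

if Version(np.__version__) >= Version("2.0"):
    raise RuntimeError(
        f"NumPy {np.__version__} detected; spateo/stardist may fail with NumPy >=2.0. "
        'Consider downgrading, e.g. pip install "numpy<2".'
    )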

Aure712 (Author) commented Nov 22, 2024

Thanks for your reply!

This is the code I used:

# Segment the stain image with StarDist (CLAHE equalization) and store the labels
st.cs.stardist(adata, equalize=2.0, out_layer='stardist_labels')

# Plot the stain image and overlay the StarDist segmentation labels
fig, ax = st.pl.imshow(adata, 'stain', save_show_or_return='return')
st.pl.imshow(adata, 'stardist_labels', labels=True, alpha=0.5, ax=ax)

Here is the relevant run log and traceback:

|-----> <select> stain layer in AnnData Object
|-----> Equalizing image with CLAHE.
|-----> Running StarDist with model 2D_versatile_fluo.
Found model '2D_versatile_fluo' for 'StarDist2D'.
2024-11-20 19:46:02.936824: E external/local_xla/xla/stream_executor/cuda/cuda_driver.cc:152] failed call to cuInit: INTERNAL: CUDA error: Failed call to cuInit: UNKNOWN ERROR (303)
Loading network weights from 'weights_best.h5'.
Loading thresholds from 'thresholds.json'.
Using default values: prob_thresh=0.479071, nms_thresh=0.3.
/public4/software/conda_env/spateo/lib/python3.12/site-packages/csbdeep/models/base_model.py:316: UserWarning:

skipping normalization step after prediction because number of input and output channels differ.

/public4/software/conda_env/spateo/lib/python3.12/site-packages/keras/src/models/functional.py:225: UserWarning:

The structure of `inputs` doesn't match the expected structure: ['input']. Received: the structure of inputs=*

100%|█████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████████| 110/110 [04:02<00:00,  2.20s/it]
---------------------------------------------------------------------------
TypeError                                 Traceback (most recent call last)
Cell In[8], line 1
----> 1 st.cs.stardist(adata, equalize=2.0, out_layer='stardist_labels')
      3 fig, ax = st.pl.imshow(adata, 'stain', save_show_or_return='return')
      4 st.pl.imshow(adata, 'stardist_labels', labels=True, alpha=0.5, ax=ax)

File /public4/software/conda_env/spateo/lib/python3.12/site-packages/spateo/segmentation/external/stardist.py:188, in stardist(adata, model, tilesize, min_overlap, context, normalizer, equalize, sanitize, layer, out_layer, **kwargs)
    186 if not min_overlap:
    187     n_tiles = (math.ceil(img.shape[0] / tilesize), math.ceil(img.shape[1] / tilesize)) if tilesize > 0 else (1, 1)
--> 188     labels = _stardist(img, model, n_tiles=n_tiles, normalizer=normalizer, **kwargs)
    189 else:
    190     labels = _stardist_big(
    191         img,
    192         model,
   (...)
    197         normalizer=normalizer,
    198     )

File /public4/software/conda_env/spateo/lib/python3.12/site-packages/spateo/segmentation/external/stardist.py:54, in _stardist(img, model, **kwargs)
     51     model = StarDist2D.from_pretrained(model)
     53 lm.main_debug(f"Running StarDist with kwargs {kwargs}")
---> 54 labels, _ = model.predict_instances(img, **kwargs)
     55 return labels

File /public4/software/conda_env/spateo/lib/python3.12/site-packages/stardist/models/base.py:788, in StarDistBase.predict_instances(self, *args, **kwargs)
    775 @functools.wraps(_predict_instances_generator)
    776 def predict_instances(self, *args, **kwargs):
    777     # the reason why the actual computation happens as a generator function
   (...)
    785 
    786     # return last "yield"ed value of generator
    787     r = None
--> 788     for r in self._predict_instances_generator(*args, **kwargs):
    789         pass
    790     return r

File /public4/software/conda_env/spateo/lib/python3.12/site-packages/stardist/models/base.py:758, in StarDistBase._predict_instances_generator(self, img, axes, normalizer, sparse, prob_thresh, nms_thresh, scale, n_tiles, show_tile_progress, verbose, return_labels, predict_kwargs, nms_kwargs, overlap_label, return_predict)
    755     prob_class = None
    757 yield 'nms'  # indicate that non-maximum suppression is starting
--> 758 res_instances = self._instances_from_prediction(_shape_inst, prob, dist,
    759                                                 points=points,
    760                                                 prob_class=prob_class,
    761                                                 prob_thresh=prob_thresh,
    762                                                 nms_thresh=nms_thresh,
    763                                                 scale=(None if scale is None else dict(zip(_axes,scale))),
    764                                                 return_labels=return_labels,
    765                                                 overlap_label=overlap_label,
    766                                                 **nms_kwargs)
    768 # last "yield" is the actual output that would have been "return"ed if this was a regular function
    769 if return_predict:

File /public4/software/conda_env/spateo/lib/python3.12/site-packages/stardist/models/model2d.py:526, in StarDist2D._instances_from_prediction(self, img_shape, prob, dist, points, prob_class, prob_thresh, nms_thresh, overlap_label, return_labels, scale, **nms_kwargs)
    524 # sparse prediction
    525 if points is not None:
--> 526     points, probi, disti, indsi = non_maximum_suppression_sparse(dist, prob, points, nms_thresh=nms_thresh, **nms_kwargs)
    527     if prob_class is not None:
    528         prob_class = prob_class[indsi]

File /public4/software/conda_env/spateo/lib/python3.12/site-packages/stardist/nms.py:177, in non_maximum_suppression_sparse(dist, prob, points, b, nms_thresh, use_bbox, use_kdtree, verbose)
    174     print("non-maximum suppression...")
    175     t = time()
--> 177 inds = non_maximum_suppression_inds(disti, pointsi, scores=probi, thresh=nms_thresh, use_kdtree = use_kdtree, verbose=verbose)
    179 if verbose:
    180     print("keeping %s/%s polyhedra" % (np.count_nonzero(inds), len(inds)))

File /public4/software/conda_env/spateo/lib/python3.12/site-packages/stardist/nms.py:220, in non_maximum_suppression_inds(dist, points, scores, thresh, use_bbox, use_kdtree, verbose)
    217 def _prep(x, dtype):
    218     return np.ascontiguousarray(x.astype(dtype, copy=False))
--> 220 inds = c_non_max_suppression_inds(_prep(dist,  np.float32),
    221                                   _prep(points, np.float32),
    222                                   int(use_kdtree),
    223                                   int(use_bbox),
    224                                   int(verbose),
    225                                   np.float32(thresh))
    227 return inds

TypeError: argument 1 must be numpy.ndarray, not numpy.ndarray

Sichao25 (Contributor) commented Dec 5, 2024

This seems to be related to the stardist package. Did downgrading NumPy resolve your issue?
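
For reference, a minimal sketch of the downgrade path suggested above (an editorial assumption, not a fix verified in this thread): pin NumPy below 2.0 and force-reinstall stardist so its compiled extension is rebuilt against the installed NumPy.

# Hypothetical downgrade/reinstall steps, run inside the spateo environment.
# --no-binary forces stardist's C extension to be rebuilt from source against
# the NumPy version that is now installed (assumed cause of the TypeError above).
import subprocess
import sys

subprocess.check_call([sys.executable, "-m", "pip", "install", "numpy<2"])
subprocess.check_call(
    [sys.executable, "-m", "pip", "install",
     "--force-reinstall", "--no-binary", "stardist", "stardist"]
)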
