GraphSage Model¶
- class graphdatascience.model.graphsage_model.GraphSageModel¶
Represents a GraphSAGE model in the model catalog. Construct this using
gds.beta.graphSage.train().
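A minimal usage sketch of how a GraphSageModel object is obtained. The graph name, feature property names, and model name below are illustrative assumptions; the live calls are shown in comments because they require a running Neo4j server with the GDS library installed.

```python
# Hedged sketch: a training configuration for gds.beta.graphSage.train().
# All names here (model name, feature properties) are illustrative assumptions.
train_config = {
    "modelName": "my-sage-model",
    "featureProperties": ["age", "rating"],  # node properties used as input features
    "embeddingDimension": 64,
    "epochs": 10,
}

# With a live connection (assumed setup, not runnable here):
#
#   from graphdatascience import GraphDataScience
#   gds = GraphDataScience("bolt://localhost:7687", auth=("neo4j", "password"))
#   G, _ = gds.graph.project("my-graph", "Person", "KNOWS",
#                            nodeProperties=["age", "rating"])
#   model, result = gds.beta.graphSage.train(G, **train_config)
#
# `model` is then a GraphSageModel exposing the methods documented below.
```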
- creation_time() Any ¶
Get the creation time of the model.
- Returns:
The creation time of the model.
- drop(failIfMissing: bool = False) Series[Any] ¶
Drop the model.
- Parameters:
failIfMissing – If True, an error is thrown if the model does not exist. If False, no error is thrown.
- Returns:
The result of the drop operation.
- exists() bool ¶
Check whether the model exists.
- Returns:
True if the model exists, False otherwise.
- graph_schema() Series[Any] ¶
Get the graph schema of the model.
- Returns:
The graph schema of the model.
- loaded() bool ¶
Check whether the model is loaded in memory.
- Returns:
True if the model is loaded in memory, False otherwise.
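The catalog inspection methods above combine naturally into a cleanup guard, for example after an experiment. This is a sketch, assuming `model` is a GraphSageModel obtained from training; with `failIfMissing=False` (the default) the `exists()` check is not strictly required, it just makes the intent explicit.

```python
# Hedged sketch: safe cleanup using exists(), loaded() and drop().
def drop_if_exists(model):
    """Drop `model` from the catalog only if it is still there."""
    if model.exists():
        if model.loaded():
            print(f"{model.name()} is loaded in memory; dropping it now.")
        return model.drop(failIfMissing=False)
    return None
```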
- metrics() Series[Any] ¶
Get the metrics of the model.
- Returns:
The metrics of the model.
- model_info() dict[str, Any] ¶
Get the model info of the model.
- Returns:
The model info of the model.
- name() str ¶
Get the name of the model.
- Returns:
The name of the model.
- predict_mutate(G: Graph, **config: Any) Series[Any] ¶
Predict on the given graph using the model and mutate the graph with the results.
- Parameters:
G – The graph to predict on.
**config – The config for the prediction.
- Returns:
The result of the mutate operation.
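A sketch of a predict_mutate call. The `mutateProperty` key follows the usual GDS mutate-mode convention, but treat it as an assumption to verify against your GDS version; the live call is commented out because it needs a server and a projected graph `G`.

```python
# Hedged sketch: configuration for predict_mutate().
# "mutateProperty" names the in-memory node property that will receive the
# embeddings (an assumption based on the GDS mutate-mode convention).
mutate_config = {
    "mutateProperty": "sageEmbedding",
}

# With a live model and projected graph G (not shown):
#   result = model.predict_mutate(G, **mutate_config)
```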
- predict_mutate_estimate(G: Graph, **config: Any) Series[Any] ¶
Estimate the memory needed to predict on the given graph using the model.
- Parameters:
G – The graph to predict on.
**config – The config for the prediction.
- Returns:
The memory needed to predict on the given graph using the model.
- predict_stream(G: Graph, **config: Any) DataFrame ¶
Predict on the given graph using the model and stream the results as a DataFrame.
- Parameters:
G – The graph to predict on.
**config – The config for the prediction.
- Returns:
The prediction results as DataFrame.
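Because predict_stream() returns a pandas DataFrame, the embeddings are easy to post-process client-side. The sketch below uses a synthetic frame standing in for the stream result; the `nodeId`/`embedding` column names are an assumption about the GraphSAGE stream output, not guaranteed by this page.

```python
import numpy as np
import pandas as pd

# Stand-in for `model.predict_stream(G)`: one row per node, holding the node id
# and its embedding vector (column names are assumptions).
embeddings = pd.DataFrame({
    "nodeId": [0, 1, 2],
    "embedding": [[0.1, 0.2], [0.3, 0.4], [0.5, 0.6]],
})

# Stack the per-node vectors into a dense (num_nodes, dim) matrix,
# e.g. for downstream clustering or similarity search.
matrix = np.vstack(embeddings["embedding"].to_numpy())
print(matrix.shape)  # (3, 2)
```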
- predict_stream_estimate(G: Graph, **config: Any) Series[Any] ¶
Estimate the memory needed to predict on the given graph using the model and stream the results.
- Parameters:
G – The graph to predict on.
**config – The config for the prediction.
- Returns:
The memory needed to predict on the given graph using the model.
- predict_write(G: Graph, **config: Any) Series[Any] ¶
Generate embeddings for the given graph and write the results to the database.
- Parameters:
G – The graph to generate embeddings for.
**config – The config for the prediction.
- Returns:
The result of the write operation.
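A sketch of a predict_write call. The `writeProperty` key follows the common GDS write-mode convention and should be verified against your GDS version; the live call is commented out because it needs a server and a projected graph `G`.

```python
# Hedged sketch: configuration for predict_write().
# "writeProperty" names the database node property that will store the
# embeddings (an assumption based on the GDS write-mode convention).
write_config = {
    "writeProperty": "sageEmbedding",
}

# With a live model and projected graph G (not shown):
#   result = model.predict_write(G, **write_config)
#   # `result` is a Series summarizing the write operation.
```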
- predict_write_estimate(G: Graph, **config: Any) Series[Any] ¶
Estimate the memory needed to generate embeddings for the given graph and write the results to the database.
- Parameters:
G – The graph to generate embeddings for.
**config – The config for the prediction.
- Returns:
The memory needed to generate embeddings for the given graph and write the results to the database.
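The *_estimate variants are useful as a pre-flight check before running a prediction on a large graph. A sketch of that pattern follows; the `bytesMax` field name is an assumption based on other GDS estimate results and should be verified against your GDS version.

```python
# Hedged sketch: gating a write-mode prediction on its memory estimate.
def safe_predict_write(model, G, budget_bytes, **config):
    """Run predict_write only if the estimated memory fits within budget_bytes.

    Assumes the estimate Series exposes a 'bytesMax' field, as other GDS
    estimate results do (an assumption, not confirmed by this page).
    """
    estimate = model.predict_write_estimate(G, **config)
    if estimate["bytesMax"] > budget_bytes:
        raise MemoryError(
            f"estimated {estimate['bytesMax']} bytes exceeds budget of {budget_bytes}"
        )
    return model.predict_write(G, **config)
```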
- published() bool ¶
Check whether the model is published.
- Returns:
True if the model is published, False otherwise.
- shared() bool ¶
Check whether the model is shared.
- Returns:
True if the model is shared, False otherwise.
- stored() bool ¶
Check whether the model is stored on disk.
- Returns:
True if the model is stored on disk, False otherwise.
- train_config() Series[Any] ¶
Get the train config of the model.
- Returns:
The train config of the model.
- type() str ¶
Get the type of the model.
- Returns:
The type of the model.