libMesh::PetscDMWrapper Class Reference

#include <petsc_dm_wrapper.h>

Public Member Functions

 PetscDMWrapper ()=default
 
 ~PetscDMWrapper ()
 
void clear ()
 Destroys and clears all built DM-related data. More...
 
void init_and_attach_petscdm (System &system, SNES &snes)
 

Private Member Functions

void init_dm_data (unsigned int n_levels)
 Initialize all the n_mesh_level-dependent data structures. More...
 
DM & get_dm (unsigned int level)
 Get reference to DM for the given mesh level. More...
 
PetscSection & get_section (unsigned int level)
 Get reference to PetscSection for the given mesh level. More...
 
PetscSF & get_star_forest (unsigned int level)
 Get reference to PetscSF for the given mesh level. More...
 
void build_section (const System &system, PetscSection &section)
 Takes the System and an empty PetscSection and populates the PetscSection. More...
 
void build_sf (const System &system, PetscSF &star_forest)
 Takes the System and an empty PetscSF and populates the PetscSF. More...
 
void set_point_range_in_section (const System &system, PetscSection &section, std::unordered_map< dof_id_type, dof_id_type > &node_map, std::unordered_map< dof_id_type, dof_id_type > &elem_map, std::map< dof_id_type, unsigned int > &scalar_map)
 Helper function for build_section. More...
 
void add_dofs_to_section (const System &system, PetscSection &section, const std::unordered_map< dof_id_type, dof_id_type > &node_map, const std::unordered_map< dof_id_type, dof_id_type > &elem_map, const std::map< dof_id_type, unsigned int > &scalar_map)
 Helper function for build_section. More...
 
dof_id_type check_section_n_dofs (const System &system, PetscSection &section)
 Helper function to sanity check PetscSection construction. More...
 
void add_dofs_helper (const System &system, const DofObject &dof_object, dof_id_type local_id, PetscSection &section)
 Helper function to reduce code duplication when setting dofs in section. More...
 

Private Attributes

std::vector< std::unique_ptr< DM > > _dms
 Vector of DMs for all grid levels. More...
 
std::vector< std::unique_ptr< PetscSection > > _sections
 Vector of PetscSections for all grid levels. More...
 
std::vector< std::unique_ptr< PetscSF > > _star_forests
 Vector of star forests for all grid levels. More...
 

Detailed Description

This class defines a wrapper around the PETSc DM infrastructure. By coordinating DM data structures with libMesh, we can use libMesh mesh hierarchies for geometric multigrid. Additionally, by setting the DM data, we can (with or without multigrid) define recursive fieldsplits of our variables.

Author
Paul T. Bauman, Boris Boutkov
Date
2018

Definition at line 51 of file petsc_dm_wrapper.h.
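A minimal usage sketch of the public interface, loosely modeled on the call pattern of libMesh::PetscDiffSolver::setup_petsc_data() and libMesh::PetscDiffSolver::clear() referenced below. It is illustrative only, not code taken from libMesh, and it assumes "system" is an initialized System and "snes" a created PETSc SNES:

#include "libmesh/petsc_dm_wrapper.h"

libMesh::PetscDMWrapper dm_wrapper;

// Build the DM, PetscSection and (in parallel) PetscSF for the finest
// mesh level and attach the DM to the SNES solver. Note build_section()
// currently requires running with --node-major-dofs.
dm_wrapper.init_and_attach_petscdm(system, snes);

// ... solve with the SNES; PETSc fieldsplit options can now be applied
// to the System's variables from the command line ...

// Destroy the PETSc objects when tearing down the solver (the destructor
// also calls clear()).
dm_wrapper.clear();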

Constructor & Destructor Documentation

◆ PetscDMWrapper()

libMesh::PetscDMWrapper::PetscDMWrapper ( )
default

◆ ~PetscDMWrapper()

libMesh::PetscDMWrapper::~PetscDMWrapper ( )

Definition at line 36 of file petsc_dm_wrapper.C.

References clear().

37 {
38  this->clear();
39 }

Member Function Documentation

◆ add_dofs_helper()

void libMesh::PetscDMWrapper::add_dofs_helper ( const System &  system,
const DofObject &  dof_object,
dof_id_type  local_id,
PetscSection &  section 
)
private

Helper function to reduce code duplication when setting dofs in section.

Definition at line 407 of file petsc_dm_wrapper.C.

References libMesh::ParallelObject::comm(), libMesh::Parallel::Communicator::get(), ierr, libMesh::DofObject::n_dofs(), libMesh::System::n_vars(), and libMesh::System::number().

Referenced by add_dofs_to_section().

411 {
412  unsigned int total_n_dofs_at_dofobject = 0;
413 
414  // We are assuming variables are also numbered 0 to n_vars()-1
415  for( unsigned int v = 0; v < system.n_vars(); v++ )
416  {
417  unsigned int n_dofs_at_dofobject = dof_object.n_dofs(system.number(), v);
418 
419  if( n_dofs_at_dofobject > 0 )
420  {
421  PetscErrorCode ierr = PetscSectionSetFieldDof( section,
422  local_id,
423  v,
424  n_dofs_at_dofobject );
425 
426  CHKERRABORT(system.comm().get(),ierr);
427 
428  total_n_dofs_at_dofobject += n_dofs_at_dofobject;
429  }
430  }
431 
432  libmesh_assert_equal_to(total_n_dofs_at_dofobject, dof_object.n_dofs(system.number()));
433 
434  PetscErrorCode ierr =
435  PetscSectionSetDof( section, local_id, total_n_dofs_at_dofobject );
436  CHKERRABORT(system.comm().get(),ierr);
437 }

◆ add_dofs_to_section()

void libMesh::PetscDMWrapper::add_dofs_to_section ( const System &  system,
PetscSection &  section,
const std::unordered_map< dof_id_type, dof_id_type > &  node_map,
const std::unordered_map< dof_id_type, dof_id_type > &  elem_map,
const std::map< dof_id_type, unsigned int > &  scalar_map 
)
private

Helper function for build_section.

This function will set the DoF info for each "point" in the PetscSection.

Definition at line 340 of file petsc_dm_wrapper.C.

References add_dofs_helper(), libMesh::ParallelObject::comm(), libMesh::Parallel::Communicator::get(), libMesh::System::get_mesh(), libMesh::OrderWrapper::get_order(), ierr, mesh, libMesh::ParallelObject::n_processors(), libMesh::FEType::order, libMesh::ParallelObject::processor_id(), libMesh::Variable::type(), and libMesh::System::variable().

Referenced by build_section().

345 {
346  const MeshBase & mesh = system.get_mesh();
347 
348  PetscErrorCode ierr;
349 
350  // Now we go through and add dof information for each "point".
351  //
352  // In libMesh, for most finite elements, we just associate those DoFs with the
353  // geometric nodes. So we can loop over the nodes we cached in the node_map and
354  // the DoFs for each field for that node. We need to give PETSc the local id
355  // we built up in the node map.
356  for (const auto & nmap : node_map )
357  {
358  const dof_id_type global_node_id = nmap.first;
359  const dof_id_type local_node_id = nmap.second;
360 
361  libmesh_assert( mesh.query_node_ptr(global_node_id) );
362 
363  const Node & node = mesh.node_ref(global_node_id);
364 
365  this->add_dofs_helper(system,node,local_node_id,section);
366  }
367 
368  // Some finite element types associate dofs with the element. So now we go through
369  // any of those, adding the Elem as the "point" in the PetscSection along with its
370  // accompanying dofs
371  for (const auto & emap : elem_map )
372  {
373  const dof_id_type global_elem_id = emap.first;
374  const dof_id_type local_elem_id = emap.second;
375 
376  libmesh_assert( mesh.query_elem_ptr(global_elem_id) );
377 
378  const Elem & elem = mesh.elem_ref(global_elem_id);
379 
380  this->add_dofs_helper(system,elem,local_elem_id,section);
381  }
382 
383  // Now add any SCALAR dofs to the PetscSection
384  // SCALAR dofs live on the "last" processor, so only work there if there are any
385  if (system.processor_id() == (system.n_processors()-1))
386  {
387  for (const auto & smap : scalar_map )
388  {
389  const dof_id_type local_id = smap.first;
390  const unsigned int scalar_var = smap.second;
391 
392  // The number of SCALAR dofs comes from the variable order
393  const int n_dofs = system.variable(scalar_var).type().order.get_order();
394 
395  ierr = PetscSectionSetFieldDof( section, local_id, scalar_var, n_dofs );
396  CHKERRABORT(system.comm().get(),ierr);
397 
398  // In the SCALAR case, there are no other variables associated with the "point",
399  // so the total number of dofs on the point is the same as that for the field
400  ierr = PetscSectionSetDof( section, local_id, n_dofs );
401  CHKERRABORT(system.comm().get(),ierr);
402  }
403  }
404 
405 }

◆ build_section()

void libMesh::PetscDMWrapper::build_section ( const System &  system,
PetscSection &  section 
)
private

Takes the System and an empty PetscSection and populates the PetscSection.

Takes the System in its current state and an empty PetscSection, and populates the PetscSection. The PetscSection is composed of global "point" numbers, where a "point" in PetscDM parlance is a geometric entity: node, edge, face, or element. We then also add the DoF numbering for each variable at each of the "points". The PetscSection, together with the PetscSF, will allow for recursive fieldsplits from the command line using PETSc.

Definition at line 98 of file petsc_dm_wrapper.C.

References add_dofs_to_section(), check_section_n_dofs(), libMesh::ParallelObject::comm(), libMesh::Parallel::Communicator::get(), ierr, libMesh::System::n_local_dofs(), libMesh::System::n_vars(), libMesh::on_command_line(), set_point_range_in_section(), and libMesh::System::variable_name().

Referenced by init_and_attach_petscdm().

99 {
100  START_LOG ("build_section()", "PetscDMWrapper");
101 
102  PetscErrorCode ierr;
103  ierr = PetscSectionCreate(system.comm().get(),&section);
104  CHKERRABORT(system.comm().get(),ierr);
105 
106  // Tell the PetscSection about all of our System variables
107  ierr = PetscSectionSetNumFields(section,system.n_vars());
108  CHKERRABORT(system.comm().get(),ierr);
109 
110  // Set the actual names of all the field variables
111  for( unsigned int v = 0; v < system.n_vars(); v++ )
112  {
113  ierr = PetscSectionSetFieldName( section, v, system.variable_name(v).c_str() );
114  CHKERRABORT(system.comm().get(),ierr);
115  }
116 
117  // For building the section, we need to create a local-to-global map
118  // of local "point" ids to the libMesh global id of that point.
119  // A "point" in PETSc nomenclature is a geometric object that can have
120  // dofs associated with it, e.g. Node, Edge, Face, Elem.
121  // The numbering PETSc expects is continuous for the local numbering.
122  // Since we're only using this interface for solvers, we can just
123  // assign any local id to any of the global ids. And since the numbering
124  // is local, we don't need to worry about processor numbering for the
125  // point ids.
126  std::unordered_map<dof_id_type,dof_id_type> node_map;
127  std::unordered_map<dof_id_type,dof_id_type> elem_map;
128  std::map<dof_id_type,unsigned int> scalar_map;
129 
130  // First we tell the PetscSection about all of our points that have
131  // dofs associated with them.
132  this->set_point_range_in_section(system, section, node_map, elem_map, scalar_map);
133 
134  // Now we can build up the dofs per "point" in the PetscSection
135  this->add_dofs_to_section(system, section, node_map, elem_map, scalar_map);
136 
137  // Final setup of PetscSection
138  // Until Matt Knepley finishes implementing the commented out function
139  // below, the PetscSection will be assuming node-major ordering
140  // so let's throw an error if the user tries to use this without
141  // node-major order
142  if (!libMesh::on_command_line("--node-major-dofs"))
143  libmesh_error_msg("ERROR: Must use --node-major-dofs with PetscSection!");
144 
145  //else if (!system.identify_variable_groups())
146  // ierr = PetscSectionSetUseFieldOffsets(section,PETSC_TRUE);CHKERRABORT(system.comm().get(),ierr);
147  //else
148  // {
149  // std::string msg = "ERROR: Only node-major or var-major ordering supported for PetscSection!\n";
150  // msg += " var-group-major ordering not supported!\n";
151  // msg += " Must use --node-major-dofs or set System::identify_variable_groups() = false!\n";
152  // libmesh_error_msg(msg);
153  // }
154 
155  ierr = PetscSectionSetUp(section);CHKERRABORT(system.comm().get(),ierr);
156 
157  // Sanity checking at least that local_n_dofs match
158  libmesh_assert_equal_to(system.n_local_dofs(),this->check_section_n_dofs(system, section));
159 
160  STOP_LOG ("build_section()", "PetscDMWrapper");
161 }
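The populated PetscSection can be inspected with the standard PETSc PetscSection query routines. The following sketch is not part of libMesh; it assumes "section" is the PetscSection filled in above and "comm" is the communicator used to create it, e.g. system.comm().get():

PetscInt pstart, pend;
PetscErrorCode ierr = PetscSectionGetChart(section, &pstart, &pend);
CHKERRABORT(comm, ierr);

for (PetscInt p = pstart; p < pend; p++)
  {
    PetscInt n_dofs, offset;

    // Total number of dofs stored at this "point" (node, element interior, or SCALAR slot)
    ierr = PetscSectionGetDof(section, p, &n_dofs);
    CHKERRABORT(comm, ierr);

    // Offset of this point's dofs within the local dof layout
    ierr = PetscSectionGetOffset(section, p, &offset);
    CHKERRABORT(comm, ierr);

    PetscPrintf(PETSC_COMM_SELF, "point %D: %D dofs at offset %D\n", p, n_dofs, offset);
  }

Summing n_dofs over the chart in this way is exactly the check performed by check_section_n_dofs() below.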

◆ build_sf()

void libMesh::PetscDMWrapper::build_sf ( const System &  system,
PetscSF &  star_forest 
)
private

Takes the System and an empty PetscSF and populates the PetscSF.

The PetscSF (star forest) is a cousin of PetscSection. PetscSection has the DoF info, and PetscSF gives the parallel distribution of the DoF info. So PetscSF should only be necessary when we have more than one MPI rank. Essentially, we are copying DofMap::get_send_list(): we specify the local dofs, the rank that communicates that dof info (for off-processor dofs that are communicated), and the dof's local index on that rank.

https://jedbrown.org/files/StarForest.pdf

Definition at line 163 of file petsc_dm_wrapper.C.

References libMesh::ParallelObject::comm(), libMesh::DofMap::dof_owner(), libMesh::DofMap::first_dof(), libMesh::Parallel::Communicator::get(), libMesh::System::get_dof_map(), libMesh::DofMap::get_send_list(), ierr, libMesh::DofMap::n_local_dofs(), and PETSC_COPY_VALUES.

Referenced by init_and_attach_petscdm().

164 {
165  START_LOG ("build_sf()", "PetscDMWrapper");
166 
167  const DofMap & dof_map = system.get_dof_map();
168 
169  const std::vector<dof_id_type> & send_list = dof_map.get_send_list();
170 
171  // Number of ghost dofs that send information to this processor
172  const PetscInt n_leaves = cast_int<PetscInt>(send_list.size());
173 
174  // Number of local dofs, including ghost dofs
175  const PetscInt n_roots = dof_map.n_local_dofs() + n_leaves;
176 
177  // This is the vector of dof indices coming from other processors
178  // We need to give this to the PetscSF
179  // We'll be extra paranoid about this ugly double cast
180  static_assert(sizeof(PetscInt) == sizeof(dof_id_type),"PetscInt is not a dof_id_type!");
181  PetscInt * local_dofs = reinterpret_cast<PetscInt *>(const_cast<dof_id_type *>(send_list.data()));
182 
183  // This is the vector of PetscSFNode's for the local_dofs.
184  // For each entry in local_dofs, we have to supply the rank from which
185  // that dof stems and its local index on that rank.
186  // PETSc documentation here:
187  // http://www.mcs.anl.gov/petsc/petsc-current/docs/manualpages/PetscSF/PetscSFNode.html
188  std::vector<PetscSFNode> sf_nodes(send_list.size());
189 
190  for( unsigned int i = 0; i < send_list.size(); i++ )
191  {
192  dof_id_type incoming_dof = send_list[i];
193 
194  const processor_id_type rank = dof_map.dof_owner(incoming_dof);
195 
196  // Dofs are sorted and continuous on the processor so local index
197  // is counted up from the first dof on the processor.
198  PetscInt index = incoming_dof - dof_map.first_dof(rank);
199 
200  sf_nodes[i].rank = rank; /* Rank of owner */
201  sf_nodes[i].index = index;/* Index of dof on rank */
202  }
203 
204  PetscSFNode * remote_dofs = sf_nodes.data();
205 
206  PetscErrorCode ierr;
207  ierr = PetscSFCreate(system.comm().get(), &star_forest);CHKERRABORT(system.comm().get(),ierr);
208 
209  // TODO: We should create pointers to arrays so we don't have to copy
210  // and then can use PETSC_OWN_POINTER where PETSc will take ownership
211  // and delete the memory for us. But then we'd have to use PetscMalloc.
212  ierr = PetscSFSetGraph(star_forest,
213  n_roots,
214  n_leaves,
215  local_dofs,
216  PETSC_COPY_VALUES,
217  remote_dofs,
218  PETSC_COPY_VALUES);
219  CHKERRABORT(system.comm().get(),ierr);
220 
221  STOP_LOG ("build_sf()", "PetscDMWrapper");
222 }
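For debugging, the resulting star forest can be examined with PETSc's own query and viewer routines. This sketch is not part of libMesh; it assumes "star_forest" is the PetscSF populated above and "comm" is the communicator it was created on:

PetscInt nroots, nleaves;
const PetscInt * ilocal;
const PetscSFNode * iremote;

PetscErrorCode ierr = PetscSFGetGraph(star_forest, &nroots, &nleaves, &ilocal, &iremote);
CHKERRABORT(comm, ierr);

// Each leaf corresponds to an entry of DofMap::get_send_list(): a ghost dof on
// this rank, with iremote[i] holding the owning rank and the dof's local index
// on that owner.
for (PetscInt i = 0; i < nleaves; i++)
  PetscPrintf(PETSC_COMM_SELF, "leaf %D -> (rank %D, index %D)\n",
              i, iremote[i].rank, iremote[i].index);

// Or dump the whole graph at once:
ierr = PetscSFView(star_forest, PETSC_VIEWER_STDOUT_WORLD);
CHKERRABORT(comm, ierr);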

◆ check_section_n_dofs()

dof_id_type libMesh::PetscDMWrapper::check_section_n_dofs ( const System &  system,
PetscSection &  section 
)
private

Helper function to sanity check PetscSection construction.

The PetscSection contains local dof information. This helper function just facilitates sanity checking that it does, in fact, contain only n_local_dofs.

Definition at line 440 of file petsc_dm_wrapper.C.

References libMesh::ParallelObject::comm(), libMesh::Parallel::Communicator::get(), and ierr.

Referenced by build_section().

441 {
442  PetscInt n_local_dofs = 0;
443 
444  // Grab the starting and ending points from the section
445  PetscInt pstart, pend;
446  PetscErrorCode ierr = PetscSectionGetChart(section, &pstart, &pend);
447  CHKERRABORT(system.comm().get(),ierr);
448 
449  // Count up the n_dofs for each point from the section
450  for( PetscInt p = pstart; p < pend; p++ )
451  {
452  PetscInt n_dofs;
453  ierr = PetscSectionGetDof(section,p,&n_dofs);CHKERRABORT(system.comm().get(),ierr);
454  n_local_dofs += n_dofs;
455  }
456 
457  static_assert(sizeof(PetscInt) == sizeof(dof_id_type),"PetscInt is not a dof_id_type!");
458  return n_local_dofs;
459 }

◆ clear()

void libMesh::PetscDMWrapper::clear ( )

Destroys and clears all built DM-related data.

Definition at line 41 of file petsc_dm_wrapper.C.

References _dms, _sections, and _star_forests.

Referenced by libMesh::PetscDiffSolver::clear(), and ~PetscDMWrapper().

42 {
43  // This will also Destroy the attached PetscSection and PetscSF
44  // Destroy doesn't free the memory, but just resets pointers internally
45  // in the struct, so we'd still need to wipe out the memory on our side
46  for( auto dm_it = _dms.begin(); dm_it < _dms.end(); ++dm_it )
47  DMDestroy( dm_it->get() );
48 
49  _dms.clear();
50  _sections.clear();
51  _star_forests.clear();
52 }

◆ get_dm()

DM& libMesh::PetscDMWrapper::get_dm ( unsigned int  level)
inline private

Get reference to DM for the given mesh level.

init_dm_data() should be called before this function.

Definition at line 82 of file petsc_dm_wrapper.h.

References _dms.

Referenced by init_and_attach_petscdm().

83  { libmesh_assert(level < _dms.size());
84  return *(_dms[level].get()); }

◆ get_section()

PetscSection& libMesh::PetscDMWrapper::get_section ( unsigned int  level)
inline private

Get reference to PetscSection for the given mesh level.

init_dm_data() should be called before this function.

Definition at line 90 of file petsc_dm_wrapper.h.

References _sections.

Referenced by init_and_attach_petscdm().

91  { libmesh_assert(level < _sections.size());
92  return *(_sections[level].get()); }

◆ get_star_forest()

PetscSF& libMesh::PetscDMWrapper::get_star_forest ( unsigned int  level)
inline private

Get reference to PetscSF for the given mesh level.

init_dm_data() should be called before this function.

Definition at line 98 of file petsc_dm_wrapper.h.

References _star_forests.

Referenced by init_and_attach_petscdm().

99  { libmesh_assert(level < _star_forests.size());
100  return *(_star_forests[level].get()); }

◆ init_and_attach_petscdm()

void libMesh::PetscDMWrapper::init_and_attach_petscdm ( System &  system,
SNES &  snes 
)

Definition at line 54 of file petsc_dm_wrapper.C.

References build_section(), build_sf(), libMesh::ParallelObject::comm(), libMesh::Parallel::Communicator::get(), get_dm(), get_section(), get_star_forest(), ierr, init_dm_data(), libMesh::MeshTools::n_levels(), and libMesh::ParallelObject::n_processors().

Referenced by libMesh::PetscDiffSolver::setup_petsc_data().

55 {
56  START_LOG ("init_and_attach_petscdm()", "PetscDMWrapper");
57 
58  PetscErrorCode ierr;
59 
60  // Eventually, we'll traverse the mesh hierarchy and cache
61  // for each grid level, but for now, we're just using the
62  // finest level
63  unsigned int n_levels = 1;
64  this->init_dm_data(n_levels);
65 
66  for(unsigned int level = 0; level < n_levels; level++)
67  {
68  DM & dm = this->get_dm(level);
69  PetscSection & section = this->get_section(level);
70  PetscSF & star_forest = this->get_star_forest(level);
71 
72  ierr = DMShellCreate(system.comm().get(), &dm);
73  CHKERRABORT(system.comm().get(),ierr);
74 
75  // Build the PetscSection and attach it to the DM
76  this->build_section(system, section);
77  ierr = DMSetDefaultSection(dm, section);
78  CHKERRABORT(system.comm().get(),ierr);
79 
80  // We only need to build the star forest if we're in a parallel environment
81  if (system.n_processors() > 1)
82  {
83  // Build the PetscSF and attach it to the DM
84  this->build_sf(system, star_forest);
85  ierr = DMSetDefaultSF(dm, star_forest);
86  CHKERRABORT(system.comm().get(),ierr);
87  }
88  }
89 
90  // We need to set only the finest level DM in the SNES
91  DM & dm = this->get_dm(0);
92  ierr = SNESSetDM(snes, dm);
93  CHKERRABORT(system.comm().get(),ierr);
94 
95  STOP_LOG ("init_and_attach_petscdm()", "PetscDMWrapper");
96 }

◆ init_dm_data()

void libMesh::PetscDMWrapper::init_dm_data ( unsigned int  n_levels)
private

Initialize all the n_mesh_level-dependent data structures.

Definition at line 461 of file petsc_dm_wrapper.C.

References _dms, _sections, _star_forests, and libMesh::MeshTools::n_levels().

Referenced by init_and_attach_petscdm().

462 {
463  _dms.resize(n_levels);
464  _sections.resize(n_levels);
465  _star_forests.resize(n_levels);
466 
467  for( unsigned int i = 0; i < n_levels; i++ )
468  {
469  _dms[i] = libmesh_make_unique<DM>();
470  _sections[i] = libmesh_make_unique<PetscSection>();
471  _star_forests[i] = libmesh_make_unique<PetscSF>();
472  }
473 }

◆ set_point_range_in_section()

void libMesh::PetscDMWrapper::set_point_range_in_section ( const System &  system,
PetscSection &  section,
std::unordered_map< dof_id_type, dof_id_type > &  node_map,
std::unordered_map< dof_id_type, dof_id_type > &  elem_map,
std::map< dof_id_type, unsigned int > &  scalar_map 
)
private

Helper function for build_section.

This function will count how many "points" on the current processor have DoFs associated with them and give that count to PETSc. We also cache, in this function, a mapping between the global node id and our local count, because we will need the local number again in add_dofs_to_section().

Definition at line 224 of file petsc_dm_wrapper.C.

References libMesh::ParallelObject::comm(), libMesh::DofMap::dof_indices(), libMesh::FEType::family, libMesh::Parallel::Communicator::get(), libMesh::System::get_dof_map(), libMesh::System::get_mesh(), libMesh::DofObject::id(), ierr, libMesh::DofMap::local_index(), mesh, libMesh::DofMap::n_local_dofs(), libMesh::ParallelObject::n_processors(), libMesh::DofMap::n_SCALAR_dofs(), libMesh::System::n_vars(), libMesh::System::number(), libMesh::ParallelObject::processor_id(), libMesh::SCALAR, libMesh::Variable::type(), and libMesh::System::variable().

Referenced by build_section().

229 {
230  // We're expecting this to be empty coming in
231  libmesh_assert(node_map.empty());
232 
233  // We need to count up the number of active "points" on this processor.
234  // Nominally, a "point" in PETSc parlance is a geometric object that can
235  // hold DoFs, i.e. node, edge, face, elem. Since we handle the mesh and are only
236  // interested in solvers, the only thing PETSc needs is a unique *local* number
237  // for each "point" that has active DoFs; note however this local numbering
238  // we construct must be continuous.
239  //
240  // In libMesh, for most finite elements, we just associate those DoFs with the
241  // geometric nodes. So we can loop over the nodes on this processor and check
242  // if any of the fields have active DoFs on that node.
243  // If so, then we tell PETSc about that "point". At this stage, we just need
244  // to count up how many active "points" we have and cache the local number to global id
245  // mapping.
246 
247 
248  // These will be our local counters. pstart should always be zero.
249  // pend will track our local "point" count.
250  // If we're on a processor who coarsened the mesh to have no local elements,
251  // we should make an empty PetscSection. An empty PetscSection is specified
252  // by passing [0,0) to the PetscSectionSetChart call at the end. So, if we
253  // have nothing on this processor, these are the correct values to pass to
254  // PETSc.
255  dof_id_type pstart = 0;
256  dof_id_type pend = 0;
257 
258  const MeshBase & mesh = system.get_mesh();
259 
260  const DofMap & dof_map = system.get_dof_map();
261 
262  // If we don't have any local dofs, then there's nothing to tell to the PetscSection
263  if (dof_map.n_local_dofs() > 0)
264  {
265  // Conservative estimate of space needed so we don't thrash
266  node_map.reserve(mesh.n_local_nodes());
267  elem_map.reserve(mesh.n_active_local_elem());
268 
269  // We loop over active elements and then cache the global/local node mapping to make sure
270  // we only count active nodes. For example, if we're calling this function and we're
271  // not the finest level in the Mesh tree, we don't want to include nodes of child elements
272  // that aren't active on this level.
273  for (const auto & elem : mesh.active_local_element_ptr_range())
274  {
275  for (unsigned int n = 0; n < elem->n_nodes(); n++)
276  {
277  // get the global id number of local node n
278  const Node & node = elem->node_ref(n);
279 
280  // Only register nodes with the PetscSection if they have dofs that belong to
281  // this processor. Even though we loop over active local elements, the dofs associated
282  // with the node may belong to a different processor. The processor who owns
283  // those dofs will register that node with the PetscSection on that processor.
284  std::vector<dof_id_type> node_dof_indices;
285  dof_map.dof_indices( &node, node_dof_indices );
286  if( !node_dof_indices.empty() && dof_map.local_index(node_dof_indices[0]) )
287  {
288 #ifndef NDEBUG
289  // We're assuming that if the first dof for this node belongs to this processor,
290  // then all of them do.
291  for( auto dof : node_dof_indices )
292  libmesh_assert(dof_map.local_index(dof));
293 #endif
294  // Cache the global/local mapping if we haven't already
295  // Then increment our local count
296  dof_id_type node_id = node.id();
297  if( node_map.count(node_id) == 0 )
298  {
299  node_map.insert(std::make_pair(node_id,pend));
300  pend++;
301  }
302  }
303  }
304 
305  // Some finite elements, e.g. Hierarchic, associate element interior DoFs with the element
306  // rather than the node (since we ought to be able to use Hierarchic elements on a QUAD4,
307  // which has no interior node). Thus, we also need to check element interiors for DoFs
308  // as well and, if the finite element has them, we also need to count the Elem in our
309  // "point" accounting.
310  if( elem->n_dofs(system.number()) > 0 )
311  {
312  dof_id_type elem_id = elem->id();
313  elem_map.insert(std::make_pair(elem_id,pend));
314  pend++;
315  }
316  }
317 
318  // SCALAR dofs live on the "last" processor, so only work there if there are any
319  if( dof_map.n_SCALAR_dofs() > 0 && (system.processor_id() == (system.n_processors()-1)) )
320  {
321  // Loop through all the variables and cache the scalar ones. We cache the
322  // SCALAR variable index along with the local point to make it easier when
323  // we have to register dofs with the PetscSection
324  for( unsigned int v = 0; v < system.n_vars(); v++ )
325  {
326  if( system.variable(v).type().family == SCALAR )
327  {
328  scalar_map.insert(std::make_pair(pend,v));
329  pend++;
330  }
331  }
332  }
333 
334  }
335 
336  PetscErrorCode ierr = PetscSectionSetChart(section, pstart, pend);
337  CHKERRABORT(system.comm().get(),ierr);
338 }

Member Data Documentation

◆ _dms

std::vector<std::unique_ptr<DM> > libMesh::PetscDMWrapper::_dms
private

Vector of DMs for all grid levels.

Definition at line 67 of file petsc_dm_wrapper.h.

Referenced by clear(), get_dm(), and init_dm_data().

◆ _sections

std::vector<std::unique_ptr<PetscSection> > libMesh::PetscDMWrapper::_sections
private

Vector of PetscSections for all grid levels.

Definition at line 70 of file petsc_dm_wrapper.h.

Referenced by clear(), get_section(), and init_dm_data().

◆ _star_forests

std::vector<std::unique_ptr<PetscSF> > libMesh::PetscDMWrapper::_star_forests
private

Vector of star forests for all grid levels.

Definition at line 73 of file petsc_dm_wrapper.h.

Referenced by clear(), get_star_forest(), and init_dm_data().


The documentation for this class was generated from the following files: